Robert,

The problem with deduction is that it rests on an assumption, based on experience, that all swans are white.

With induction, only white swans are encountered, so the assumption arises that all swans must be white. It is only when we forget that deduction's premises arise from experience that we start to think there must be some fundamental law which requires swans to be white.

I agree that all we have to work with is assumptions. The question is which assumptions best explain the evidence, i.e., Ockham's razor. That is a bit of a trial-and-error process. Even our gut, using immune responses evolved over generations of interaction with the environment, makes assumptions as to what is safe and what is not. So while some may react to a toxin, others may not.

John,

Assuming no gross errors in logic have been made, then:

For deduction: if the premises are certainly true, then the conclusion is certainly true.

For induction: if the premises are certainly true, then the conclusion is probably true.

For abduction: if the premises are certainly true, then the conclusion is possibly true.

It is not quite correct to say that deduction is based on an assumption. Whether or not the initial premises, assumptions, etc. are true is quite independent of the fact that the reliabilities of the conclusions generated by the three methods are not equal.

Consequently, in any well-reasoned argument based on deduction, if the conclusion is wrong, it must be because an initial premise was wrong. The same cannot be said of induction and abduction.

In other words: initial premises and assumptions are generated via reasoning based on induction and abduction only; deduction plays no role. Then, and only then, may deduction be applied to these premises to generate conclusions.
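
A toy numerical sketch of this ordering, in Python (the swan population and sample size are invented for illustration):

import random

# Ground truth, unknown to the reasoner: 1 swan in 1000 is black.
population = ["white"] * 999 + ["black"]

# Induction: the premises (the observations) are certainly true,
# yet the conclusion generalized from them is only probably true.
sample = random.sample(population, 50)
all_observed_white = all(s == "white" for s in sample)  # usually True
# Concluding "all swans are white" from this sample can still be wrong.

# Deduction: IF "all swans are white" were certainly true, THEN "the next
# swan is white" would be certainly true as well; any failure of the
# deduced conclusion traces back to a false premise.
premise_all_white = all(s == "white" for s in population)  # False here
print(all_observed_white, premise_all_white)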

Robert,

"Consequently, in any well reasoned argument, based on deduction, if the conclusion is wrong, it must be because an initial premise was wrong."

Garbage in, garbage out.

The question is which assumptions and premises are open to question. What is today's version of the geocentric premise?

I've argued it's that treating time as a measure emphasizes the effect of sequence, rather than the cause of action, but there are lots of other aspects of physics which, upon closer examination, seem to crumble to dust, or sprout wings and fly off in the most unexpected directions, as expressed in these many essays. I would say, though, that the future is not where the information points, but where the energy flows. Too often, I find, information points to itself, when visionaries give way to craftsmen, who are more concerned with the craft than the bigger picture. Philosophy, religion, politics, economics, etc. Why would physics be any different? The "It from bit" syndrome has to go, as you so well argue.

Yet gloves have other characteristics than handedness: number of fingers, material composition, stitching, etc. Can we ever be sure we have accounted for all properties? How can we distinguish between the reduced information of the quantum realm, compared to the classical realm, and the sort of limited information that belongs to conceptual systems that are simply defined that way? In other words, how to distinguish the real attributes of gloves from those, like handedness, we have defined or specified?

Hi Robert:

Thanks for your reply.

Please also respond to my second comment as to how I address the problem of "memory-less" particles in my posted paper, "From Absurd to Elegant Universe".

I would greatly appreciate your comments and rating on my paper.

Thanks

Avtar

Dan,

You asked: "In other words, how to distinguish the real attributes of gloves from those, like handedness, we have defined or specified?"

First, handedness is a real attribute of a glove. But more importantly, real attributes will always be determined to have the same "value" no matter how you look at them. If you look at a glove from any angle, it will always appear to have the same handedness, either left or right. An object like a coin is different. If you view a coin from different angles, it may change "state" from heads to tails.

The "state" of a coin is not an attribute of the coin. It is an attribute of the relationship between the coin and the observer.

You also asked: "Can we ever be sure we have accounted for all properties?" No. We cannot. We don't know what we don't know. On the other hand, we do know what we do know. That fact can be exploited in many very interesting and sophisticated ways. At the very least, it should be exploited, as in the case of "spin", to recognize that the observed phenomenon, "spin states", behaves more like the "state" of a coin than like an actual attribute of the particle.
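
A minimal sketch of this coin/glove distinction, in Python (all names invented for illustration):

def observed_handedness(handedness, viewing_angle):
    # A real attribute: the same value from every viewpoint.
    return handedness

def observed_face(face_up, observer_side):
    # A relational "state": it depends on the observer's position,
    # not on the coin alone.
    if observer_side == "front":
        return face_up
    return "tails" if face_up == "heads" else "heads"

# Every viewing angle reports the same handedness...
assert {observed_handedness("left", a) for a in range(0, 360, 30)} == {"left"}
# ...but the coin's apparent state flips with the observer's position.
assert observed_face("heads", "front") != observed_face("heads", "back")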

Robert,

You stated: "In a very real sense, Newton's law of gravity is perfectly analogous to the JPEG image compression algorithm. It is a "lossy" description of the original data, no more no less; the reconstructed, predicted "image" is slightly different from the original. In contrast, Einstein's theory of gravity appears to be a "lossless" compression algorithm."

I think that this sentence might be counter to everything else you are stating in your essay. By definition, the process of anti-differentiation (which Newton used to map gravitational force into his law of gravity) is a reduction of the possible information into one interpretation of the data, and that loss has likely passed into GR (i.e., it is still lossy). My essay has some questions, but a sketch I just posted may be a simpler visual explanation.

Regards,

Jeff

Hi Robert,

Being an old copywriter, I cannot help trying to catch your wonderful essay in a one-liner: "Confusing Mathematics for Physics is to Complicate the Simple and Simplify the Complex."

We are several authors in this contest who, from different points of view, question the role of mathematics in physics. Mathematics has long been its lingua franca, and the "shut up and calculate" promoters are not just a few. The funny thing is that during the same period of time, since the mid-seventies, in which calculation capacity has been increasing exponentially, physical theory-building has been comparatively meagre. More is less, it seems.

Best regards,

Inger

Robert,

"The speed of light is not directly observable; it is not an observable phenomenon at all. There is always a "privileged" observer."

"...When a light wave is first created, it is created in the reference frame of this privileged observer. It is created at the frequency observed by that observer, not the Doppler shifted frequency of an observer in a different frame of reference. And the privileged observer is always at rest with respect to itself."

"...Hence, when all other observers transform their actual observables to the privileged observer's frame, they too must infer the same constant speed of light."

These phrases and meanings are precisely common to our essays. But I'm awestruck by your logical analysis of mathematical limits. I abandoned the study of maths from intuition about its shortcomings in modelling reality, for which you now give the precise reasons, beautifully argued and written. I now far better understand WHY the Fourier transform fails, though I even wrote of FM receiver mechanisms some time ago.

My route to this end was logic and ontology, also observational, from optics and as an astronomer. I dare to suggest I've also pushed a little further than you, in maverick style, to find curved space-time and even pre-big-bang conditions. Though I hazard that you too report far from all your findings.

I hope you'll find time to read my essay. It is quite different to yours, as it simply analyses the mechanistic evolution of real systems without abstraction, avoiding the limitations you so brilliantly identify. I'd like to cite you, but saw no references. Do you have anything published on this?

There are a couple of areas where we diverge, and I'd like to scrutinise those, along with a couple of new kinetic considerations, such as the relationship of f and wavelength for different observer frames, and the severe limitation I derive for the spatial limits of the emitter's frame.

Very Best wishes

Peter

    Sorry, I didn't mean to say that handedness is not a real property, only that it is not the ONLY property of gloves. The question remains for me: why do quantum entities possess/reveal so little information?

    Dan,

    I totally agree with you....

    Dan, you asked:

    "Why do quantum entities possess/reveal so little information?"

    Why does a coin have only two sides? Why not?

    Lack of information is equal to lack of complexity; it is simple. And simple objects can be described via a simple set of symbols, ultimately just a single bit.

    So your question boils down to "Why do such simple entities exist?" Neither I nor anyone else can offer anything other than pure speculation as an answer. Here is mine:

    Simple entities exist so that complex entities can evolve, as combinations of them.

    Peter,

    Most of the basic ideas in my essay can also be found in a book I wrote twenty years ago:

    "Human and Machine Intelligence: An Evolutionary View"

    After reading Roger Penrose's book, "The Emperor's New Mind", I was inspired to write my own book as a rebuttal to his. His was a best seller; mine fell into a black hole shortly after being published. Used copies can still be found on Amazon and other on-line booksellers.

    The book is mostly about the nature of intelligence, viewed as an outgrowth of sensory signal processing. The quantum aspects came about in order to refute Penrose's claim that intelligence is probably the result of some as-yet-undiscovered quantum oddity.

    Robert

    I also disagree that we have any intelligence, as a result of Penrose's quantum oddity or otherwise! I can, however, see both sides of that one. Having arrived at a recycling cosmology with largely (but not solely) re-ionized matter, there is some certainty that, with infinitely many recyclings, some particles in our brains may have originally been part of the brain of some intelligent being! Pretty crazy stuff, I know, but I use rigorous logic, from which it emerges as implicit! So there may be hope for us yet. Of course, mechanistically I agree entirely with you, which I'd expect is 99.99999% of the game. (I wonder if homeopathy really works; we may find out in the UK, as our new Health Secretary is a fan!)

    I'll track down a copy of your book. I'm sure it's more readable than Penrose. I'd also like to cite it, or perhaps your essay, in a current paper. I wasn't implying anything by my comment except the 'great minds think alike' commonality of our conclusions; coming from such very different directions must imply some fundamental veracity. I've found some massive importance in your truths, touched on in my essay.

    I do hope you'll get to read it, can find and extract the real gold nuggets, and give me your views. I also need a bit of maths thrown at the concepts as I'm averse to going too near the stuff myself.

    Well done for yours. Certainly a top score coming from me, I think. Have you read Tom Ray's yet, also nicely debagging Bell à la Joy Christian?

    Peter

    P S

    Dear Robert, please don't mistake my one-liner as a disrespectful and slipshod simplification of the complexity of your essay! Should it be so, I apologize. D S

    As for your old book, I will try to find and purchase it. Not only to be able to better understand your essay, but also because there might be a possible connection with my PhD thesis, which deals with the contrast/controversy between the simplifications of artificial intelligence and the complexity of human knowledge.

    Hi Robert,

    You were saying:

    "If you look at the relations given for the Uncertainty Principle and Shannon's Capacity, for the single particle case mentioned, in which S/N =1, then the uncertainty principle boils down to the statement that "1 = maximum number of bits of information that can be extracted from an observation, in the worst case."

    Duh

    So what is the big deal? What makes this so significant?"

    If I'm understanding this correctly, and my calculation is right, you're saying that the boundary between the quantum and classical world is fuzzy, and that the commutator vanishes when the signal (particle count) increases while the particle-count error stays constant.

    So, if I had a ball of 1e200 fundamental particles (and I know this count precisely because I know their individual masses and I weighed the ball real good, etc., etc.), this would give me a commutator product of roughly 0.001, not 1. So I would be about 999/1000ths of the way into the classical regime.
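
    A quick Python check of that arithmetic (my own sketch; the 1e200 figure comes from the paragraph above):

    import math

    # Single particle, worst case: S/N = 1 gives the quantum limit.
    print(1 / math.log2(1 + 1))        # 1.0

    # The ball, treated as S/N = 1e200:
    print(1 / math.log2(1 + 1e200))    # ~0.0015, the same order as the 0.001 above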

    To me, that is an absolutely fantastic calculation, because:

    1) It makes sense to me even from a purely classical point of view: http://www.phys.unsw.edu.au/jw/uncertainty.html

    2) As far as I can tell, it's novel, because I read about theoretical physicists humming and hawing all the time about what may or may not constitute the boundary between the quantum and classical regimes, in language not as specific as yours.

    Holy cow dude!

      (yes, I realize that particle count is not a constant, but I'm sure there is a slick way to calculate a mean based on the various different field configurations... the point is that you totally just blew my mind)

      No, I believe I had it wrong just then.

      I believe that in your formula for the commutator relationship

      [math]\delta t \cdot \delta f = \frac{1}{\log_2(1 + S/N)},[/math]

      S seems to be the actual particle count, and N seems to be the idealized particle count. For instance, if you had all kinds of particles bunched together and considered them to be a single baseball, then N = 1. On the other hand, if you consider every particle individually, then N = S (the actual particle count). As S/N goes to 1, a transition from the classical to the quantum regime occurs. I believe this to mean that when S is held constant, a token amount of random quantum noise is added into the system whenever N is incremented by 1.
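
      As a hypothetical Python sweep of that reading (holding S fixed while varying the grouping N; purely my own illustration, not your result):

      import math

      S = 1e200                           # actual particle count, from the example above
      for N in [1.0, 1e3, 1e100, 1e200]:  # from "one baseball" to every particle counted
          ratio = S / N
          product = 1 / math.log2(1 + ratio)
          print(N, product)               # product rises toward 1 (quantum) as S/N -> 1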

      Maybe you could talk to S. Hossenfelder about this. She has some very interesting things to say about gravity and singularities and a vanishing Planck action (h, the main commutator coefficient). Perhaps this is the mechanism in question. I believe that string theory kind of says that black holes are "one" particle, insomuch as they are a continuation of the particle spectrum, and so your calculation would probably apply to both electrons and black holes, in a unified kind of way.

      Altogether, I think that it's extremely impressive. I do believe it when you say that it's a mathematical truth, and also the physical interpretation is pretty convincing.

      There is a much simpler way of looking at the Shannon Capacity relation:

      dt specifies the duration of the observation

      df specifies the number of samples per second that must be taken to preserve the entire information content of the continuous, band-limited signal being observed.

      The log term specifies the number of bits needed within each sample (it is counting bits; that is why it is always given as base-2) to preserve all the information. Obtaining more bits per sample would not significantly increase the total number of recoverable bits of information.

      So the Shannon Capacity boils down to the statement that the MAXIMUM number of recoverable bits of information cannot exceed the number of bits required to "digitize" the signal in the first place.

      The uncertainty principle, at the other end, boils down to the statement that the MINIMUM number of recoverable bits of information corresponds to a signal that can be "digitized" with exactly one sample, with exactly one bit per sample: one bit.

      If you obtain fewer bits than that, then you have failed to make any observation at all.

      So, of course the uncertainty principle represents a fundamental limit. The point is that the limit is true by definition of what is meant by a bit of information. It says nothing at all "interesting" about the nature of reality. Heisenberg mistakenly thought he had discovered some deep, underlying mystery of nature. In fact, he merely discovered a very peculiar way of restating the definition of a bit of information.
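
      As a numerical restatement of these two limits (a minimal sketch in Python, with invented sample values):

      import math

      def recoverable_bits(dt, df, snr):
          # Shannon Capacity: (duration) x (samples per second) x (bits per sample)
          return dt * df * math.log2(1 + snr)

      # MAXIMUM: a 1-second observation, 1000 samples/sec, S/N = 1000
      print(recoverable_bits(1.0, 1000.0, 1000.0))   # ~9967 bits

      # MINIMUM: one sample with one bit per sample (S/N = 1) = one bit;
      # fewer than this, and no observation has been made at all.
      print(recoverable_bits(1.0, 1.0, 1.0))         # 1.0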

      Rob McEachern

      Hi Rob,

      Thanks for responding with some clarifications. Perhaps I can see now why you have such an interest in Lorraine's essay.

      - Shawn