While this article is a breath of fresh air in recognizing the role of Ashby's law of requisite variety in control systems, entanglement spoils the picture.

Bar-Yam understood that requisite variety is a theorem in complex systems science, re-introducing classical logic. Dynamic centrality preserves locality via negative feedback.

From my 2007 conference paper "Time, Change and Self-organization:"

"3.2. Because each discrete sequential event has a finite range, information boundaries should correspond to cardinal directions of 3-space for a 6-dimensional, 2-point boundary, finite analysis. [Casti, 1996] A model of dynamic centrality [Braha & Bar-Yam, 2006] in which dominant nodes exchange position continuously reveals that high network connectivity is sensitively dependent on time. To exploit this characteristic, in order to extract accurate information about a present action from a future state, one treats the network as a self-organized system exhibiting infinite self-similarity--each interval in which a singularity forms is a new initial condition. Because we now know, as a result of Perelman's proof of the Poincaré Conjecture, that singularities of the topological positively curved 3-manifold are extinguished only in finite time [Anderson, 2004]--then if time is an n-dimensional infinitely orientable metric on a self-avoiding random walk, a network of random-output computers ("calculating machines") corresponds to quantum time intervals randomly orienting in an infinite-dimensional (Hilbert) space--in which the principle of self-similarity forces an ordered direction of continuous time in the limits of the 3-manifold. [Ray, 2006]"

The accompanying slide presentation: home.comcast.net/~thomasray1209/ICCS2007PP.ppt

    H = -K log p, or S = K log p, and what is the information?

    Redundancy and uncertainty, after all; the aim is to show which are the real events, with the right sortings.
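
For concreteness, the quantity being gestured at is Shannon's H = -K Σ p log p. A minimal Python sketch (the distributions here are made-up examples) showing how entropy and redundancy come out of it:

```python
import math

def shannon_entropy(probs, k=1.0):
    """H = -K * sum(p * log2(p)) over outcomes with p > 0."""
    return -k * sum(p * math.log2(p) for p in probs if p > 0)

# Uniform distribution over 4 outcomes: maximal entropy, zero redundancy.
uniform = [0.25, 0.25, 0.25, 0.25]
h_max = shannon_entropy(uniform)      # 2.0 bits

# Skewed distribution: lower entropy, positive redundancy.
skewed = [0.7, 0.1, 0.1, 0.1]
h = shannon_entropy(skewed)
redundancy = 1 - h / h_max            # fraction of capacity not used
```

A uniform source has maximal entropy and zero redundancy; any skew lowers H and raises the redundancy.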

    I think I am going crazy; I am going to take my meds. Perhaps I will be less paranoid. Oh my God.

    There is much that is true about this article...

    "The book, actually first published in 1956, was W. Ross Ashby's An Introduction to Cybernetics, which outlined a framework for unifying the sciences--the classical disciplines, that is--in terms of how systems can be controlled.

    ...That brings us back to that pivotal Friday afternoon in Nottingham. In An Introduction to Cybernetics, Ashby laid out the Law of Requisite Variety, which states that to keep a system stable, a control mechanism must have at least as many states as the system does."
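
Ashby's law can be made concrete with a toy model (my own construction for illustration, not from Ashby's book): disturbances and regulator responses are both states modulo n, and regulation means some response cancels every disturbance. A regulator with fewer states than the disturbances cannot do it:

```python
def can_regulate(n_disturbances, responses):
    """True if, for every disturbance d, some response r pins the
    outcome (d + r) % n_disturbances at the target value 0."""
    n = n_disturbances
    return all(
        any((d + r) % n == 0 for r in responses)
        for d in range(n)
    )

full_variety = range(4)   # 4 responses vs 4 disturbances
low_variety = [0, 1]      # 2 responses vs 4 disturbances

print(can_regulate(4, full_variety))  # True: every disturbance cancelled
print(can_regulate(4, low_variety))   # False: some disturbances escape
```

Only variety in the regulator can absorb variety in the disturbances, which is the content of the law quoted above.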

    and then there are a few statements that are problematic...

    "That kind of correlation can't happen in the classical world--once set in motion, one ball's spin doesn't affect any other's--suggesting the need for a new, quantum cybernetics to understand quantum systems."

    The spins of two classical balls are actually perfectly correlated in a deterministic universe and therefore perfectly controllable. It is phase coherence that classical cybernetics lacks since determinism implies coherence for all spins. Quantum spins can be either coherent or incoherent and so it is really the incoherence of quantum spins that differentiates quantum from classical, not their coherence.

    That is, all spins remain coherent with all other spins in a determinate classical universe since all of the information is knowable from initial conditions and all spins are therefore knowable. The decay of quantum phase coherence means that while quantum spins can be coherent, the decay of quantum phase coherence into incoherence can result in unknowable spin states.

    Unknowable spin states are not possible to control and must simply be accepted as the way the universe is. The information of phase decoherence is not lost, it is rather given up to the same pure dephasing of the shrinking universe of which we are a part. The dephasing of the quantum universe represents information that is unknowable because we are part of the same dephasing.

    Although quantum algebra often incorporates time-averaged solutions that appear instantaneous, there is no such thing as an instantaneous quantum change.
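
For reference, the correlation at issue can be sketched numerically. The standard QM prediction for a spin-1/2 singlet pair measured along angles a and b is E(a, b) = -cos(a - b); the Monte Carlo below simply samples outcomes from that distribution (a hedged illustration of the textbook prediction, not a model of decoherence):

```python
import math
import random

def singlet_correlation(a, b, trials=100_000, rng=None):
    """Monte Carlo estimate of E(a, b) for a spin-1/2 singlet pair.
    QM gives P(outcomes agree) = sin^2((a - b) / 2), so E = -cos(a - b)."""
    rng = rng or random.Random(0)
    p_same = math.sin((a - b) / 2) ** 2
    same = sum(rng.random() < p_same for _ in range(trials))
    return (2 * same - trials) / trials  # fraction(+1) - fraction(-1)

# Equal angles: outcomes always disagree, E = -1 (perfect anticorrelation).
e0 = singlet_correlation(0.0, 0.0, trials=10_000)
# Angles pi/2 apart: QM predicts E = 0, i.e. no correlation at all.
e90 = singlet_correlation(math.pi / 2, 0.0)
```

The angle dependence -cos(a - b) at every relative angle, not just perfect (anti)correlation at aligned settings, is what a deterministic once-set-in-motion classical model has to reproduce.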

    Thomas Howard Ray wrote on Oct. 2, 2015 @ 15:55 GMT: "We cannot reconcile quantum mechanics with classical mechanics, because quantum events don't happen continuously in time; what we observe is purely interpreted in a mathematical model, a geometric event-space of discontinuous, statistical functions ("rolls of the dice")."

    A photon is a complicated superposition of frequencies or wavelets that couples into a complicated superposition of states between quantum objects. It is a large number of discrete exchanges of matter and phase that couples photons with states among objects. It is from the discrete time delays of those discrete matter exchanges that we imagine a continuous time and space and continuous motion in that continuous time and space.

    However, the nature of quantum reality is of discrete matter exchanges and discrete time delays. Thus your thesis of discrete information exchange as a basis for time is a valid one, as long as you reconcile it with quantum dephasing, which represents information lost to the dephasing of the universe and therefore unknowable, i.e., rolls of the dice.

    I agree, Steve. I don't believe, however, that we imagine continuous spacetime and continuous motion.

    I assume continuity by the principle of comprehensibility. No such principle attends quantum mechanics, which is why it must fill the gap with ad hoc assumptions.

    So long as one assumes a terra incognita of unknowable knowledge, there is no escape from the existential dilemma. Using Chaitin's discovery of maximal unknowability, one can recover continuity system-wide -- i.e., uncomputable functions select for efficiency. What is not efficiently configured feeds back into the system as negative feedback, and nothing is wasted, nothing ultimately unknowable.

    This is the meaning of time dependency, with multi-scale variety.

    Steve wrote,

    "A photon is a complicated superposition of frequencies or wavelets that couples into a complicated superposition of states between quantum objects." A photon may be described as a superposition, but a photon is not a superposition. A superposition is merely one possible mathematical description of a photon. Communications engineers use an entirely different type of mathematical description for things like photons, one based on modulation theory rather than superposition theory. Unlike most physicists, comms engineers are familiar with both types of description, and with when the use of one is more appropriate than the other. Consequently, physicists are constantly mistaking the properties of their preferred, but poorly chosen, description for properties of the entities being described. But there is no such one-to-one correspondence in most cases. Assuming that there is has been the source of much of the supposed "weirdness" in QM.

    Rob McEachern
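
Rob's point about two interchangeable descriptions can be checked with the textbook AM identity: a carrier modulated by a single tone is, sample for sample, the same waveform as a superposition of three tones (the carrier plus two sidebands). The frequencies and modulation index below are arbitrary examples:

```python
import math

# AM identity: (1 + m*cos(wm*t)) * cos(wc*t)
#            = cos(wc*t) + (m/2)*cos((wc+wm)*t) + (m/2)*cos((wc-wm)*t)
fc, fm, m = 100.0, 10.0, 0.5   # carrier freq, modulating freq, mod. index
max_err = 0.0
for i in range(1000):
    t = i / 1000
    modulated = (1 + m * math.cos(2*math.pi*fm*t)) * math.cos(2*math.pi*fc*t)
    superposed = (math.cos(2*math.pi*fc*t)
                  + (m/2) * math.cos(2*math.pi*(fc+fm)*t)
                  + (m/2) * math.cos(2*math.pi*(fc-fm)*t))
    max_err = max(max_err, abs(modulated - superposed))

# max_err sits at floating-point noise: one signal, two exact descriptions.
```

Neither description is "what the signal is"; both reproduce the identical samples, which is exactly the trap of mistaking a description's properties for the entity's.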

    Robert,

    Superb observation, and brilliantly illuminated by your prose. Nice! jrc

    Robert,

    I largely agree with you. Let me just add that even the application of the complex Fourier transform already requires the unphysical assumption of a frozen block time spanning from minus to plus infinity.

    When the fathers of QM, like Dirac, maintained that there are only positive frequencies, they didn't understand this. I hope you are willing to check my arguments, although they are scattered over many essays.
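
The infinite-time assumption has a visible discrete counterpart: the DFT implicitly treats a finite record as one period of an endlessly repeating signal, so a tone that does not complete a whole number of cycles within the record smears across the whole spectrum. A sketch (record length and frequencies are arbitrary choices):

```python
import cmath
import math

def dft_magnitudes(samples):
    """Naive O(N^2) DFT magnitude spectrum; fine for a small sketch."""
    n = len(samples)
    return [abs(sum(samples[t] * cmath.exp(-2j * math.pi * k * t / n)
                    for t in range(n)))
            for k in range(n)]

n = 64
# Tone landing exactly on a DFT bin: the record contains whole cycles.
on_bin = [math.cos(2 * math.pi * 8 * t / n) for t in range(n)]
# Tone halfway between bins: the record truncates it mid-cycle.
off_bin = [math.cos(2 * math.pi * 8.5 * t / n) for t in range(n)]

nonzero_on = sum(v > 1e-6 for v in dft_magnitudes(on_bin))    # 2 sharp lines
nonzero_off = sum(v > 1e-6 for v in dft_magnitudes(off_bin))  # leakage everywhere
```

Only when the signal is consistent with the transform's implicit periodic (effectively eternal) extension does it appear as sharp lines; any real finite observation of an incommensurate tone leaks.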


    Here's another perspective on the applications of quantum-like effects to complex systems, without using entanglement:

    https://www.researchgate.net/publication/254630915_The_Problem_of_Quantum-Like_Representation_in_Economy_Cognitive_Science_and_Genetics?showFulltext=true

    If entanglement falls, as a fundamental principle, dynamic centrality remains.

    Which opens up insight into the meaning of "local realism". What is local on one scale is not necessarily local on another. Thus, requisite variety reduces to multi-scale variety. Without magic.

    23 days later
    • [deleted]

    Vacuum state

    In quantum field theory, the vacuum state (also called the vacuum) is the quantum state with the lowest possible energy.

    (...)

    If the quantum field theory can be accurately described through perturbation theory, then the properties of the vacuum are analogous to the properties of the ground state of a quantum mechanical harmonic oscillator (or more accurately, the ground state of a QM problem). In this case the vacuum expectation value (VEV) of any field operator vanishes. For quantum field theories in which perturbation theory breaks down at low energies (for example, Quantum chromodynamics or the BCS theory of superconductivity) field operators may have non-vanishing vacuum expectation values called condensates. In the Standard Model, the non-zero vacuum expectation value of the Higgs field, arising from spontaneous symmetry breaking, is the mechanism by which the other fields in the theory acquire mass.
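
For reference, the ground state invoked above: the quantum harmonic oscillator has the standard spectrum E_n = ħω(n + 1/2), so even the vacuum (n = 0) carries a nonzero zero-point energy ħω/2. The frequency below is an arbitrary example:

```python
# Standard QHO spectrum: E_n = hbar * omega * (n + 1/2).
HBAR = 1.054_571_817e-34  # reduced Planck constant, J*s (CODATA value)

def oscillator_energy(n, omega):
    """Energy of level n (n = 0, 1, 2, ...) at angular frequency omega."""
    return HBAR * omega * (n + 0.5)

omega = 1.0e15                          # rad/s, arbitrary example frequency
e_vacuum = oscillator_energy(0, omega)  # zero-point energy: positive, not 0
spacing = oscillator_energy(1, omega) - e_vacuum  # uniform gap hbar*omega
```

The nonzero e_vacuum is the oscillator analogue of the field-theory vacuum described in the excerpt: lowest possible energy, but not zero energy.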

    (...)

    Symmetry

    For a relativistic field theory, the vacuum is Poincaré invariant, which follows from Wightman axioms but can be also proved directly without these axioms.[9] Poincaré invariance implies that only scalar combinations of field operators have non-vanishing VEV's. The VEV may break some of the internal symmetries of the Lagrangian of the field theory. In this case the vacuum has less symmetry than the theory allows, and one says that spontaneous symmetry breaking has occurred. See Higgs mechanism, standard model.

    https://en.wikipedia.org/wiki/Vacuum_state

    Reality finds itself to exist at the lowest possible energy level or ground state (or as described by perturbation theory to be the ground state of the Quantum Mechanical Harmonic Oscillator) and may or may not escape itself at this level depending on whatever causally influences its mass (Higgs field) and energy level.

    See: http://www.sciforums.com/threads/reality-is.152528/page-10

    • [deleted]

    Thank you Steve.

    8 days later
    • [deleted]

    Some of the reasoning you have in play there seems - on the face of things anyway - problematic. Cybernetics, when it first came out, had a new distinctiveness on its own terms, yes, but well below any 'stand-alone' standard. It did not stand on its own in terms of new value. But it was an excellent illustration, for the 1950s audience, of the possibilities in the 'shape of things to come'. In that co-dependent combination, cybernetics was a development with a degree of historical import.

    But not on its own. Cybernetics didn't really bring any new discoveries. It was a vision, but not one by which the sciences would be reformed and CORRECTED in the scientific sense of DISCOVERY. Cybernetics made no predictions and, moreover, had within itself no theoretical breakthroughs capable of seeding predictions and discoveries in the future. Cybernetics was a vision for how the sciences could be integrated and their developments synchronized, with a vision for how, from this, the forces of discovery would generate, driving emergent processes that would up-level via "meta stasis transition" events, transforming synchronicity into orchestration, to Beethoven's catch 22nd.

    It was a vision, and in the 1950s it seemed like it had to be amazingly true. But it did not contain anything genuinely new; it didn't advance our understanding. It deserved the meteoric rise it got for being avant-garde for the main part of the population. Not, though, for the thousands of brilliant scientific personnel returning to civilian life after orchestrating a synchronized strategy, replete with brilliant invention and discovery, to run a world war.

    Your average Enterprise Resource Planning (ERP) package of the 1990s, basically supply chain management with some cleverness, would make cybernetics look like a shaggy dog story.

    So summing up in brief, you've got this cybernetics thing that would add value to quantum solutions by matching all the states.

    But the challenge in QM is understanding and getting better theory and control of ....states.

    Which cybernetics would totally do for you, once the states and the theoretical and empirical work are all concluded successfully for states that can be controlled.

    Once that is in the bag, cybernetics can be the thing that implements by matching all the states and controlling the states by incorporating theory and technology that you are going to discover in the future.

    But cybernetics doesn't add value to the challenge of discovery. And for that reason it's a passive player, and nothing is different than if you didn't mention it at all. The scientific problem is just the same. Samsung will build you a control system when the heavy lifting is done. Sorry to be so negative...but there it is.

    a month later
    • [deleted]

    Seems related to the work on top-down causation of Sara Walker and Paul Davies. Fascinating stuff.

    9 days later

    Diederik Aerts and his colleagues at the VUB and elsewhere have been pursuing a not dissimilar approach (applying QM principles isomorphically to macroworld social and other issues) for a couple of decades. Just noting.

    One concern. All this seems to be edging fairly close to a ToE or -- not contradictory -- something akin to empirical proof that P=NP. Or am I fundamentally misreading?
