A timely discussion!... one that demonstrates the critical importance of appropriately applying mathematical methodologies.
It has been more than 70 years since Albert Einstein posed the question: "Can a field theory which describes exhaustively physical reality, including four-dimensional space, be specified directly?"... and he did not concur with the then, and still, prevailing conviction that the experimentally assured duality of nature... i.e. corpuscular and wave structure... can be realized only in conformity with present quantum theory... i.e. "... in an indirect way, by a statement of the statistics of the results of measurement attainable on the system."
"I think that such a far-reaching theoretical renunciation is not for the present justified by our actual knowledge, and that one should not desist from pursuing to the end the path of the relativistic field theory."~ A. Einstein
REF: Albert Einstein, Relativity: The Special and General Theory, 17th Edition, Crown Publishers, 1961, p. 157
If "system" is defined as being "... composed of well-defined components such as numbers and operators", and the degree to which a component is "well-defined" is a consequence of its resolve as a minimum/indivisible constituent of the system being analyzed, then the mathematical methodologies utilized to derive objectifiable minimum/indivisible elements of a Space-Time Energy emergence are critical to the definitional validity of all subsequent systems... i.e. all subsequent system components are comprised of the most fundamental elements of a Space-Time Energy emergent system analysis.
As a consequence of investigations of Space-Time Energy emergence, from a single point sourced emission of minimum/indivisible units of spatially defined Energy (QE), within a non-perturbative QE distribution framework that defines minimum/indivisible spatial units (QI) of QE occupancy, the emergence of Space-Time as a component of a universe comprised of a Spaceless-Timeless Logic Domain and a Space-Time Logic Domain has been found to be neither random nor predetermined... i.e. in that it must be solved on every Q-Tick of the pulsed source momentum mechanism, it is determinate as a consequence of resolve.
Given a non-perturbative distribution framework as a networked intelligence... i.e. a computing resource... the dynamics of QE occupancy of QI can be Spontaneously, Harmoniously Resolved (SHR)... i.e. NEXT occurs.
Given a single point pulse sourced emission of spatially defined units of substance... i.e. system components... and a non-perturbative distribution structure... i.e. system operative intelligence... on pulse close of every Q-Tick of the momentum mechanism, the system could theoretically be considered deterministic, and with the universally networked intelligence as one's computational assistant, future QE distribution can be resolved 1 Q-Tick at a time.
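As a loose illustration only (not the UQS formalism), the "1 Q-Tick at a time" resolve described above can be sketched as a deterministic update loop, in which the complete state at pulse close fixes the next state; the names and rule here (`step`, a Rule-110-style neighbor table) are hypothetical stand-ins:

```python
# Hypothetical illustration: a deterministic pulsed update.
# The entire occupancy state at Q-Tick n fixes the state at Q-Tick n+1;
# nothing random enters between pulses.

def step(state: tuple) -> tuple:
    """One 'Q-Tick': a fixed neighbor rule (elementary CA Rule 110)
    maps the current occupancy pattern to the next one."""
    n = len(state)
    live = {(1, 1, 0), (1, 0, 1), (0, 1, 1), (0, 1, 0), (0, 0, 1)}
    return tuple(
        1 if (state[(i - 1) % n], state[i], state[(i + 1) % n]) in live
        else 0
        for i in range(n)
    )

state = (0, 0, 0, 1, 0, 0, 0)   # NOW: a single occupied site
for _ in range(3):              # resolve NEXT, one tick at a time
    state = step(state)

# Re-running from the same NOW always yields the same NEXT:
assert step((0, 0, 0, 1, 0, 0, 0)) == step((0, 0, 0, 1, 0, 0, 0))
```

The point of the sketch is only that a system solved anew on every pulse is determinate by construction, however large the required computing resource.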
Whether "... the behavior of a system can be inferred from the behavior of the components" is only verifiable to the degree of component analysis reduction.
Whether "... the behavior of the components may be inferred from the behavior of the system as a whole", is only verifiable to the degree of system operative analysis reduction.
Within a container of gas, temperature is a measurement of motion, but an inability to resolve the contribution of internal molecular motion to the temperature of the whole does not imply that "temperature is not a property of the molecular component..."... i.e. an analysis reduction of molecules exposes internal molecular dynamics/motion.
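The molecular-reduction point above can be made concrete with the textbook kinetic-theory relation for a monatomic ideal gas, ⟨KE⟩ = (3/2)·k_B·T, i.e. the bulk temperature is recoverable from the mean per-molecule kinetic energy (a standard-physics sketch, independent of any QE framework):

```python
# Kinetic theory: bulk temperature of a monatomic ideal gas is a measure
# of mean molecular kinetic energy: <KE> = (3/2) * k_B * T, so
# T = 2 * <KE> / (3 * k_B).
k_B = 1.380649e-23  # Boltzmann constant, J/K (exact in SI since 2019)

def temperature_from_mean_ke(mean_ke_joules: float) -> float:
    """Recover the bulk temperature from the per-molecule mean KE."""
    return 2.0 * mean_ke_joules / (3.0 * k_B)

# A mean KE of ~6.21e-21 J per molecule corresponds to roughly 300 K:
T = temperature_from_mean_ke(6.21e-21)
```

The "property of the whole" is thus fully accounted for at component scale once the reduction is made.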
Spin path dictates at QE scale... i.e. dynamics of minimum/indivisible units of spatially defined Energy... induce observationally imperceptible distance/time variations that invalidate application of observed distance/time parameters as justification to categorize gravitational systems as being deterministic... i.e. gravitational systems are deterministic as a consequence of a resolvable/deterministic Space-Time Energy emergence, not as a consequence of observable distance/time relationships.
Admittedly it would require massive computational resources, but flipping a coin within a non-perturbative analysis environment is theoretically deterministic.
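The coin-flip claim can be illustrated with a Keller-style toy model of a tossed coin: given full initial conditions (launch speed and spin rate), the landing face is fixed by the number of half-rotations completed in flight. This is a deliberately simplified sketch; `coin_outcome` and its parameters are hypothetical illustrations, not a full rigid-body analysis:

```python
import math

# Toy model: a coin tossed straight up, caught at launch height,
# spinning at a constant rate. The outcome is fully determined by
# the initial conditions -- no probability enters anywhere.

g = 9.81  # gravitational acceleration, m/s^2

def coin_outcome(v0: float, omega: float, starts_heads: bool = True) -> str:
    """v0: launch speed (m/s); omega: spin rate (rad/s).
    Flight time is 2*v0/g; the coin turns omega*t radians,
    i.e. (omega*t)/pi half-rotations. An odd count flips the face."""
    t_flight = 2.0 * v0 / g
    half_turns = int(omega * t_flight / math.pi)
    flipped = (half_turns % 2 == 1)
    if starts_heads:
        return "tails" if flipped else "heads"
    return "heads" if flipped else "tails"

# Identical initial conditions always yield the identical outcome:
assert coin_outcome(2.0, 40.0) == coin_outcome(2.0, 40.0)
```

The apparent randomness of a real flip then reflects only our ignorance of, and sensitivity to, the initial conditions, not indeterminism in the dynamics.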
In that a non-perturbative Space-Time Energy analysis environment facilitates an unbroken logic chain from observed QE event to QE source, any necessity for numeric manipulators which introduce probability and/or normalization functions, is eliminated.
At QE-scale, a Space-Time Energy system component is irreducible... i.e. a minimum/indivisible unit... but if uniform throughout the system, is it a reductionist system?
System analysis at electron scale may suggest that "... a pair of entangled electrons is not a reductionist system as it is impossible to predict the behavior of the systems through an analysis of the individual electrons. An entangled electron pair must be treated as if it is an indivisible system composed of two electrons."
Is an "electron" a detectable spin interaction event, or a spatially defined unit of Energy?
If as an event, an electron's mass is associated with the QE participants of the spin interaction.
If as a spatially defined unit of Energy it is comprised of QE... i.e. is divisible.
In either case, two electrons may appear "entangled" at electromagnetic scale, but at QE scale, "an indivisible system composed of two electrons" can be resolved as QE-spin interactions... i.e. electrons appear entangled as a consequence of QE dynamics... and all QE-spin interactions are determinate within a non-perturbative emergence analysis environment.
"As an example of a deterministic and reductionist system, classical mechanics utilizes well-defined properties such as mass, position, velocity, and time that interact with each other in well-defined manner."
The "well-defined" qualifier in the above statement, raises fundamental derivation methodology issues... i.e. only within a framework in which Space, Time, and Energy are well-defined, can mass be considered a well-defined system component.
REF: D. Oriti, "Physical Space and Physical Time: What are they?"
Newton's application of numeric manipulators to perturbatively derived evaluators as a means to "...derive a new property called ‘force’", has obscured the fact that substance dynamics requires a momentum mechanism that inherently resolves all forces as derived of a single Force.
In that QE and QI are indivisible, and a pulsed momentum mechanism implies an indivisible temporal unit (Q-Tick) as the pulse rate, calculus manipulations are not necessarily applicable to evaluation of Space-Time Energy emergence analysis.
Science should not, as a consequence of misapplied numeric manipulators, reject "new" fundamental element derivation methodologies... i.e. to "add to the vocabulary of mathematics" may generate relational statistics, which are not applicable to Space-Time Energy emergence analysis.
"Relativity and the standard model" are conceptualizations that rely heavily on perturbatively derived system components and operators, and their formalisms are unable to objectify required fundamental elements of emergence... i.e. momentum mechanism, substance, and distribution intelligence... which inhibits resolve of all forces as derived of a single Force.
Mathematically manipulating undefined system components and operators... i.e. mass, space and time... to derive "force" has created an illusion... i.e. Force is a consequence of an emergence momentum mechanism, not a property of math manipulations, and its properties are revealed in the distribution resolve dynamics of QE, as minimum/indivisible units of potential motion, within an intelligent distribution framework.
"Thermodynamics is the prototype for systems that are non-deterministic...", but as discussed above, molecular heat phenomena are deterministic if an analysis reduction that facilitates temperature/motion measurement at molecular-dynamic scale is made.
Within a non-perturbative distribution framework, in which the QE distribution structure functions as a networked intelligence, all motion is reducible to QE-scale dynamics, which are, given adequate computing resources, deterministic to Q-Tick = NOW, and resolvable to Q-Tick = NEXT.
"Thermodynamics led to the discovery of the property called entropy. Entropy is a measure of the uncertainty, disorder, or randomness of a system."
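The quoted notion that entropy measures uncertainty can be anchored in Shannon's formula, H = -Σ p·log₂(p): a fully determined outcome has H = 0, while a fair coin carries exactly 1 bit of uncertainty. This is standard information theory, not a UQS-specific quantity:

```python
import math

def shannon_entropy(probs) -> float:
    """H = -sum(p * log2(p)); terms with p = 0 contribute 0 by convention."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A system whose outcome is certain carries no uncertainty:
assert shannon_entropy([1.0]) == 0.0
# A fair two-outcome system carries exactly 1 bit of uncertainty:
assert shannon_entropy([0.5, 0.5]) == 1.0
```

On this definition, a zero-entropy measure registers a determined system; the question below is whether the converse reliably holds.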
How appropriate is it to apply entropy to a system analysis of the universe, if the universe is a resolvable/deterministic system?
A thermodynamical necessity for "entropy" is indicative of a lack of a non-perturbative Space-Time Energy emergence model.
"Quantum mechanics is one of the few examples of a formal system that attempts to describe a system that is both non-deterministic and non-reductionist."
In that QE dynamics... i.e. quantum mechanix as opposed to "quantum mechanics"... have to be solved for the entire system, on every Q-Tick, NOW has obviously been solved by the system-inherent Networked Intelligence... i.e. determined... and NEXT is, as a consequence of the system-inherent Networked Intelligence, resolvable.
The inability of "quantum mechanics", as formalized mathematics, to establish QE-scale dynamics... i.e. Quantum Mechanix... as a resolvable/determinate system, inhibits application of the system-inherent Networked Intelligence.
"The mathematics of quantum mechanics, linear algebra over a complex Hilbert space, was discovered by chance as a way of explaining the confusing data that was discovered in the beginning of the 20th century."
The current state of Science, and consequently human functionality, suggests it is a prudent time to pursue a Space-Time Energy "field" model that resolves "confusing data"... e.g. the aforementioned description of an "entangled electron pair"... rather than continue to mathematically manipulate... i.e. normalize... "confusing data".
An "... increasing reliance on non-deterministic/non-reductionist approaches demands that mathematics develop a formal model of the concept of uncertainty."
A "formal model" must define "uncertainty" as differentiated from its polar opposite... i.e. certainty.
Given "certainty" as the result of a process that can establish an unbroken kinematic logic chain from conditional at any time now to conditional at any prior time, one can know "How"... i.e."How" necessitates a logic domain that preserves differential logic operations.
Given that "certainty" implies a requirement to know "Next", a logic domain that facilitates inherent resolve of "Next" is necessitated.
If "certainty" is the result of resolve and determinism, the antithesis would be: go forth without resolve and leave an indeterminate "How" in your wake... i.e. the "concept of uncertainty" may not yield to mathematical formalization.
To eliminate "uncertainty", the UQS CAD SIM formalization embeds "certainty" within an objectified universe, comprised of a Spaceless-Timeless Logic Domain as the Resolve/Intelligence component, which experiences/perceives a Spaceless-Timeless FEELING of Space-Time Energy distribution dynamics, and pulse sources QE to Spontaneously, Harmoniously Resolve (SHR) NEXT within the Space-Time Logic Domain, as the determinate component, on every pulse... i.e. Q-Tick.
Within the Space-Time Logic Domain, all Space-Time Energy differentials are a consequence of an emerging resolvable/determinate system... i.e. deterministic to NOW, and resolvable to NEXT.
Space-Time differentials necessitate 2-bit logic operators, which facilitate inference... i.e. logic expansion that maintains determinism.
In that "uncertainty" is a consequence of abstraction... i.e. conceptual expansion that does NOT maintain determinism... it is not surprising that "... uncertainty is present in mathematics in the form of concepts...".
Objectifiable entities, as QE, or configurations of QE choreographed composites, within the Space-Time Logic Domain component of the universe, are resolved... i.e. a kinematic logic chain from entity QE emergence to current QE/QI configuration, exist.
Mathematical manipulations that break that chain are not applicable to Space-Time Energy emergence analysis.
Abstracted relationships between visually objectifiable entities can augment cause and effect analysis, but application of an abstraction that invalidates determinism of an entity upon which the abstraction is applied, will invalidate the analysis.
The "uncertainty" attributed to an inability of classical mechanics to "... describe how three objects are attracted to each other gravitationally..." is a consequence of the limitations of perturbative analysis... i.e. it is not inherent within a Space-Time Logic Domain component of a universe in which the mechanix of emergence and subsequent distribution of irreducible objectifiable entities is resolvable/deterministic.
The inability of quantum mechanics to resolve "... both the position and the velocity of a mass with absolute certainty." is likewise a consequence of the limitations of perturbative analysis... i.e. it is not inherent within the universe's Space-Time Logic Domain component, in which the mechanix of emergence and subsequent distribution of irreducible objectifiable entities is resolvable/deterministic.
"Probability" as a mathematical device... i.e. numeric manipulator... contrived as a means to evade "uncertainty", inhibits momentum toward establishing an emergence model that exhibits distribution intelligence... i.e. a resolvable/deterministic system... and thus eliminating "uncertainty".
In that "The mathematics of quantum mechanical systems manipulate probability distributions (actually they manipulate a relative called a complex wavefunction) to determine how the system evolves over time."... i.e. do not resolve "uncertainty"... and in that a Space-Time Logic Domain exists as a universal component in which the quantum mechanix of emergence and subsequent distribution of irreducible objectifiable entities is a resolvable/deterministic system, the mathematics of quantum mechanics warrants re-formalization within a non-perturbative analysis environment.
Although my 2023 FQXi Essay: "Digital Science: Emergence of Quantum Consciousness" (http://uqsmatrixmechanix.com/2023FQXiEssay4pdfconv.php) was intended as a demonstration of how a non-perturbative analysis structure would change Science, the demonstration objectified a momentum mechanism as a single point pulse sourced emission of substance, as spatially defined minimum/indivisible quanta of Energy (QE), which inherently resolves all forces as derived of a single Force, and consequently FQXi rejected my essay as being an "alternative 'theory of everything', not an essay about how Science could be different."
"Nentropy is an example of a new kind of mathematics, one that is ideally structured to characterize both the non-reductionist and the non-deterministic properties of a pair of entangled electrons."
In that "... even deterministic systems have uncertainty", how does a nentropy of 0, which "... indicates a system with no uncertainty whatsoever.", reliably identify a deterministic system?
Although utilizing the methodologies of formalized quantum mechanics can generate new mathematics... e.g. Nentropy as a "meta system" qualifier... and determination of cause and effect correspondence validates numerical manipulation, this does not imply that Nentropy as "... an example of a formal model of uncertainty..." is an appropriate tool to evaluate a resolvable/deterministic quantum mechanical system... i.e. a "degree of entropy" necessitates a concept of "uncertainty".
"It is now possible to describe any model of mechanics (classical, statistical, quantum, et al.) in terms of their nentropy, their degree of uncertainty."
A system may appear non-deterministic until the distribution intelligence becomes known... i.e. "uncertainty" can be attributed to a lack of conceptual system resolution achievable utilizing inadequate analysis methodologies.
"New" methodologies... e.g. a non-perturbative geometry/logic framework... that exhibit a resolvable/deterministic distribution intelligence, for distribution of minimum/indivisible quanta of spatially defined potential motion (QE), justify a re-evaluation of any assessment that quantum mechanix is a "... system that is described by a discrete, finite probability distribution."
Application of numeric manipulators to "understanding complex systems" must consider appropriateness.
In that the current formalization of the "mathematics of quantum mechanics" does not objectify fundamental elements of emergence... i.e. momentum mechanism, substance, and distribution intelligence... the "mathematics of quantum mechanics" may be applicable to analysis of abstractions which deviate from any requirement for an unbroken kinematic logic chain... i.e. develop non-determinate/non-reductionist system characteristics... but quantum mechanix can be shown to be resolvable/deterministic within a non-perturbative analysis environment.
The "... human genome is an example of a non-reductionist system since two or more genes acting in concert often behave differently than the individual genes.", but genes interact as a consequence of the deterministic QE spin operatives as dictated by the dynamics of emergence within a non-perturbative QE distribution analysis environment... i.e. "deterministic" is a matter of analysis scale.
REF: "UQS Consciousness Investigation" (http://www.uqsmatrixmechanix.com/UQSConInv.php)
It is possible "... to re-purpose the mathematics of quantum mechanics" to investigate "... the relationship between the human genome and cancer..." at other than minimum/indivisible component scale, but if protons, as constituents of genes, can be resolved to QE-Spin resolution, the analysis would facilitate abstraction of an unbroken kinematic logic chain from a gene event to QE source... i.e. eliminate the "How" "uncertainty" induced by the perturbative nature of the current formalism of "quantum mechanics"... and facilitate system intelligence to effect resolve of "Next".
"... it is possible to apply mathematics to better understand the behaviors and properties of non-reductionist and non-deterministic systems in a variety of unrelated disciplines."
To procedurally establish a methodology "... structured to probe the behaviors of non-deterministic and non-reductionist systems so prevalent in the non-hard sciences.", it is necessary to examine the processes by which non-reductionist and non-deterministic systems are generated.
- Clarify a distinction between inference and abstraction.
- Identify the initial abstraction.
- Identify objectifiable components that underlay the abstraction.
- Isolate any subsequent requirement that invalidates determinism of any underlying objectified components.
Advocating a "paradigm" that undermines the possibility that reality is a resolvable/deterministic system, in order to accommodate the numerically manipulated abstractions that have generated non-deterministic and non-reductionist systems, is not justifiable.
"It is indeed both telling and a bit ironic that the non-deterministic/non-reductionist system called quantum mechanics is recognized as the most accurate description of reality ever developed."
What if reality is a resolvable/deterministic system?... i.e. CAD SIMulations derived from application of "new" non-perturbative methodologies to Space-Time Energy emergence investigations suggest that any formalization of quantum mechanics that portrays quantum reality as a non-deterministic/non-reductionist system is not "the most accurate description of reality ever developed."
Thanks for an informative and well-written essay that demonstrates the subtle means by which momentum for a fundamental methodology paradigm is being inadvertently constrained.