"No finite set of experiment can extract complete information about any particular object. In our evidence (data) gathering experiments , we study an unknown object by letting it interact with a "known" object . We never know any object completely. So we are forced to approximate, assume, etc. We can keep refining our knowledge through diverse iterations. Mathematical theory is immensely helpful. However, Gödel's Incompleteness Theorem gives us another limiting block. Hence, our search for fundamental building blocks and fundamental laws of interactions must keep evolving". Not necessarily. Since we can know/extract only a finite amount of information, there is a likely convergence point where the difference between our model and reality will be smaller than what we can measure/describe, making our model a final model FAPP. Similar remarks about Gödel's two theorems.
"How to guide this evolution? A working theory, based upon human created postulates and human invented mathematical theories do not automatically guarantee that we can grasp the ontological reality of the universe." Obviously humans cannot describe completely universe, because we are part of the Universe. A 'complete' description/characterization of the Universe would require the existence of a superobserver living outside the Universe, but since the Universe is by definition isolated, such superobserver is useless. As stated above all what is required by a model is to fit reality. It is useless to talk about the "ontological reality of the universe" outside our human limits.
"One of the acceptable ways to define "fundamental", whether a universal building block, or universal laws of operations behind the incessant cosmic evolution, is, that which will minimize the number of independent postulates necessary to build a unified field theory to cover all observable phenomena. Such a definition will help us understand and appreciate that the universe, after all, is one comprehensible system." This is not an acceptable way to define "fundamental" since fields are crude approximations to real material systems.
"We now have too many disjointed theories. Note that all of our most successful theories, from Maxwell's Electromagnetism, to General Relativity, to Quantum Field Theories, all indicate that the universe is emergent out of some complex field." And the three ones are approximate theories which can be replaced by better theories where there is no fields. E.g. Maxwell electromagnetism can be replaced by Wheeler-Feynman electrodynamics or by some superior model.
"Iteraction Process Mapping Epistemology" looks as fancy word vacuous of content, because claims such as it is a "changing system of thinking" and "A map is not the territory" have been present in science since ancient times.
"We never know complete information about anything in this universe. Therefore, we must use our imaginations to create some rational postulates to close the information gap." No, we cannot close the information gap. We can formulate any postulate, but to check if the postulate is correct or not, we have to compare it with reality, and we only can access a part of reality by obvious reasons; which means we cannot really check the postulate. We can imagine anything that we want, but we are not closing the gap. Those imagined postulates, outside the realm of experiment and observation, fall into the scope of philosophy, religion, and similar disciplines whose real contributions to understanding the Universe are easily summarized: zero.
"No working theory is complete. No knowledge is final knowledge." As explained above the possibility of achieving a final theory FAPPP is not ruled out.
"Galaxies are formed slowly through gravitational accumulation of thin gaseous particles. Then the condensed gas creates enormous pressure in its core, igniting a prolong set of nuclear reactions. A star is born. The nuclear reaction then evolves in different directions, eventually creating the demise of the stars through supernova explosions, or through prolong states of red giant. Should we try to re-construct the concept of forces such that they are fundamentally dialectical? This approach may eventually direct us to develop fundamental laws that directs us to better understand whether the universe is fundamentally a cyclic system, or evolving towards an equilibrium, or towards a death!" We need an evolutive vision in physics to describe this cosmic evolution. Prigogine summarized this need of a change of paradigm in the postface of one of his last books:
"Nature has a history-for a long time the ideal of physics was geometry, as implied in Einstein's general relatively. Relativity is certainly one of the great achievement of the human mind. But the geometrical view is incomplete. Now we see that narrative elements also play a basic role. This leads to a different concept of nature in which the arrow of time is essential. After all, this arrow appears as the feature which is common to all objects in the expanding bubble which is our universe. We all age in the same direction; all stars, all rocks age in the same direction even if the mechanism of aging is different in each case. Time, better the direction of time, is the fundamental existential dimension of human life. We discover now that the flow of time is universal. Time is no more separating men from nature."
But this vision doesn't require a change in the concept of force. What we need is an emphasis on the topology of the solutions, including the many instabilities and bifurcations that generate the cosmic evolution.
"Our neural logic system has evolved to assure our comfort and survival within the prevailing scientific enterprise. Our culture trains us to conform to the prevailing working rules and theories. Hence, our enquiring minds have the dominant tendency to create new theories that build upon the prevailing working theories." No. It doesn't have anything to do with culture or neural structures. It has to do with the proper nature of the scientific endeavor. Since theories are confirmed in specific empirical ranges, a new theory designed to explain some phenomena cannot be described by the former theory has only two logical possibilities. Either the new theory is a disjoint theory that only works for the new phenomena or it is a covering umbrella of the older theory. The development of new theories that are covering former theories has a number of advantages and that is why it is the preferred choice in science.
"However, the independent thinking human mind can rebuild theories fresh and anew from a more congruent new set of coherent postulates". Sure, but rebuilding the axiomatic structure of the theory doesn't change the underlying physics. Changing the CKC postulates by the MTE postulates doesn't change the description of thermal phenomena.
There is nothing fundamental in the "fundamental rule of Non-Interaction of Wave (NIW)". Moreover, the physical reason why the wave equation accepts the linear superposition of waves is that the wave model is based on a local approximation of the interactions between particles. This local approximation eliminates the nonlinearities associated with correlations that spread over both space and time, and yields a wave equation which is only valid for times t ≫ t_corr.
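The linearity of the wave equation can be checked numerically. The sketch below (a 1D leapfrog scheme with assumed grid parameters and periodic boundaries, chosen here purely for illustration) shows that evolving the sum of two pulses gives the same result as summing the two separate evolutions, down to round-off:

```python
import numpy as np

# Leapfrog integration of the 1D wave equation u_tt = c^2 u_xx,
# used to verify linearity: evolving u1 + u2 gives the same result
# as evolving u1 and u2 separately and summing.

def evolve(u0, steps, c=1.0, dx=0.01, dt=0.005):
    """Evolve initial profile u0 (zero initial velocity) by `steps` leapfrog steps."""
    r2 = (c * dt / dx) ** 2  # squared Courant number (stable for r2 <= 1)
    u_prev = u0.copy()
    # First step from zero initial velocity (second-order Taylor start)
    u = u0 + 0.5 * r2 * (np.roll(u0, 1) - 2 * u0 + np.roll(u0, -1))
    for _ in range(steps - 1):
        u_next = 2 * u - u_prev + r2 * (np.roll(u, 1) - 2 * u + np.roll(u, -1))
        u_prev, u = u, u_next
    return u

x = np.linspace(0.0, 1.0, 200, endpoint=False)
u1 = np.exp(-((x - 0.3) / 0.05) ** 2)   # Gaussian pulse 1
u2 = np.exp(-((x - 0.7) / 0.05) ** 2)   # Gaussian pulse 2

separate = evolve(u1, 100) + evolve(u2, 100)
together = evolve(u1 + u2, 100)
print(np.max(np.abs(separate - together)))  # round-off level: superposition holds
```

Because every operation in the discretized equation is linear, the two results agree to machine precision; adding a nonlinear term to the scheme would break this agreement immediately.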
There is nothing "non-causal" in single-particle interference, delayed choice, etc.
"The observable universe consists of various emergent oscillations of the fundamental Complex Tension Field (CTF)". As mentioned above the field-theoretic models are only crude approximations to real phenomena.
"The excited state energies of CTF in the form of EM waves and the field-particles cannot be arbitrarily assimilated by the CTF itself. This provides the rationale for the observed universal law of conservation of energy." As mentioned above, this CRT model is invalid, but even if it was valid, it doesn't provide any rationale for the venerable law of conservation of energy. Stating that excited energies cannot be arbitrarily assimilated by the CFT only provides ground to the trivial concept that d_eE is canceled because the flow is 'conservative', but it doesn't say anything about the existence of a nonzero production. So, as always, the simplest route to introduce the law of conservation of energy in our models is by postulating d_iE = 0.
"CTF automatically accommodates the effects of the two postulates of Special Relativity, satisfying one key definition of what "fundamental" is". Accommodating the 1st postulate of SR is trivial. One can simply start with the expression of energy for a single particle and obtain the velocity in the massless limit. What is more, we can make add gravitational interactions and then demonstrate that the speed of light is no longer constant, making bogus the claim that "EM waves [...] propagate at the universally constant velocity c". This variability on c can be used to explain phenomena as light bending around massive objects.
"This why the basic laws of quantum mechanics we are finding on earth and our solar system, appear to be systematically valid for all the stars in all the galaxies." Only if we ignore the possibility of those basic laws of quantum mechanics need modifications in presence of massive objects. Penrose and others have detailed some a priori expected modifications.
"Thus, we have accommodated the two postulates of SR as direct causal derivatives of the stationary CTF, rather than as independent postulates." Ignoring again that CTF model is invalid, ignoring that those postulates are more general because are also valid in experimental regimes beyond the scope of field theory. The postulates of SR have not been derived, there is only vague talk about how those postulates would follow from CFT. It is possible imagine a modified CFT model where the CFT is split into regions each one with its own set of physical laws. A true derivation of the postulates of SR would need of invoking extra postulates about homogeneity and isotropy of CFT and more. So we finish with a less economical formal system. But this is really irrelevant because the whole model is invalid as a foundation of reality.
"Stationarity of the CTF will change the very foundation of physics thinking [...] Since all planets spin and rotate, no planet-based laboratories should be considered as inertial frame of reference for physics experiments." This is obviously incorrect. Up to extreme precision we can consider our laboratories as inertial frames. It is the reason why general relativity is unneeded to explain CERN experiments.
Tired-light mechanisms have been disproven again and again.
"Removal of wave-particle duality by differentiating Mathematical Superposition Principle (SP) from the Observable Superposition Effect (SE)". Wave-particle duality is a common misconception of quantum mechanics. Any decent advanced book avoids this duality, because it doesn't exist. So, one doesn't need to invoke new postulates and concepts to remove the duality, only a proper rigorous study of quantum theory is required.
Solutions to the Schrödinger equation aren't "waves" but representations of quantum states. Those representations are often named wavefunctions by historical accident. The one-particle solution looks like a classical wave, but this superficial analogy breaks down when one adds a second particle to the problem and the state jumps to a (6+1)-dimensional space that is no longer isomorphic to the space where real waves are defined.
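The dimensional jump can be stated explicitly: a classical wave and a two-particle quantum state live on entirely different domains:

```latex
u = u(\mathbf{r}, t) :\; \mathbb{R}^3 \times \mathbb{R} \to \mathbb{R},
\qquad
\Psi = \Psi(\mathbf{r}_1, \mathbf{r}_2, t) :\; \mathbb{R}^3 \times \mathbb{R}^3 \times \mathbb{R} \to \mathbb{C} ,
```

so the two-particle wavefunction is defined on configuration space, not on physical space, and no classical wave picture survives the transition.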
The claim that Schrödinger 'waves' (really wavefunctions) are at the foundation of quantum mechanics is also wrong. In the first place, one can formulate quantum mechanics without wavefunctions and without the Schrödinger equation. In the second place, the Schrödinger equation and wavefunctions are only valid for closed systems and for pure states. The quantum state of a molecule in a heat bath is not represented by any wavefunction, and we need more advanced formulations.
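A standard example of such a formulation: a molecule equilibrated with a heat bath at temperature T is described by a density operator, e.g. the canonical Gibbs state, rather than by any single wavefunction:

```latex
\hat{\rho} \;=\; \frac{e^{-\hat{H}/k_B T}}{\operatorname{Tr}\, e^{-\hat{H}/k_B T}} ,
```

which reduces to a pure state |ψ⟩⟨ψ| only in limiting cases (e.g. T → 0 with a nondegenerate ground state).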
"For both the cases of superposition experiments, whether using light beams or particle beams, the dark fringes are generated due to the absence of physical stimulations. This is not, as is historically assumed, due to non-arrival of "photons" or particles at these locations." But we can count particles. So we can count the number of particles in the bright regions and check it agrees with the number of particles emitted, invalidating the model where particles impacted in the dark regions without 'stimulating' the detector.