This article presents an alternative binary scheme for the statistical inference of variability. The discovery of binary integers in 1703 by Leibniz, a co-inventor of the calculus, contributed to the start of the information age. Using this example, we call for a renewed focus on the holistic and change-oriented perspectives of the ancient binary orders, rooted in the metaphysical vision of the legendary Chinese figure Fuxi. This exemplary scheme is made possible only by modern integrated-circuit technology and a little-known variability formula from Schrödinger, one of the founding fathers of quantum mechanics.
Chips and Science: Holistic Binary Integration and Processing Inspired by the Ch
Here is an open question:
Let us express the mathematical relations of this essay in the form of a commutator. The commutator [H, A] = HA − AH = (H/A)·σ² holds for large values of m and n, as supported by equations (3), (6), (B3), and (B4). This commutator bears a resemblance to Heisenberg's commutation relation [p, x] = −ih/2π, which underlies the uncertainty principle: both encode intrinsic variability parameters.
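Since equations (3), (6), (B3), and (B4) are not reproduced in this thread, the following is only a minimal numerical sketch of the mean-based variability relation I take the commutator to encode: for a large sample with small relative spread, the arithmetic mean A and harmonic mean H satisfy A − H ≈ σ²/A, so the variance can be recovered from the two means alone. The sample parameters mu, sigma, and n below are arbitrary illustration choices, not values from the essay.

import random

# Minimal sketch: variance recovered from arithmetic and harmonic means.
# Second-order relation (small coefficient of variation): A - H ~ sigma^2 / A.
random.seed(42)
n = 1_000_000
mu, sigma = 10.0, 0.5              # illustrative values only (sigma/mu = 0.05)
xs = [random.gauss(mu, sigma) for _ in range(n)]

A = sum(xs) / n                    # arithmetic mean
H = n / sum(1.0 / x for x in xs)   # harmonic mean
var = sum((x - A) ** 2 for x in xs) / n

print(f"A = {A:.5f}, H = {H:.5f}")
print(f"direct variance : {var:.5f}")
print(f"A * (A - H)     : {A * (A - H):.5f}")  # agrees to O((sigma/mu)^2)

Whether this second-order identity is exactly the Schrödinger variability formula the essay invokes, I can only guess; it is at least the standard relation linking the two means to σ².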
In the realm of the holographic principle and black-hole physics, certain theories propose a correlation between the fundamental units of information within a black hole, often referred to as "bits," and the Planck scale. Perhaps, just perhaps, within the holistic binary integration scheme described in this essay, this commutator holds the potential to transcend the limitations of Moore's law and establish a novel information-theoretical scaling paradigm.
Dear PersimmonHalibut:
Thank you very much for your inspiring essay, which I have studied intensively and which has given me some fresh impetus. Here are a few quick, and perhaps not fully thought-through, comments:
On the interaction between fundamentals and the chip industry:
Wasn't it the case with thermodynamics that first there was the steam engine, and only then did scientists scramble to understand the physics behind it, which took a good hundred years? I don't see the antagonism between technology and science as critical in terms of content. In the end, technology is also the next stage of the physics experiment: it demonstrates to everyone, every day, that thermodynamics, aerodynamics, Maxwell's equations, and so on are correct! Saturation phases are normal in any development, not only in technological ones.
If I have understood the intention of your article correctly, your very general question about the future of electronics leads to a probably new procedure, proposed by you (?), of statistical inference for testing today's nanoscale transistors. Is this really the solution for the future of information processing? What do you say about the potential of quantum computers? Are there alternatives to this? What do you think about memristors in a 3D circuit with neuromorphic logic? Is your approach perhaps a bit too specialized by comparison?
Re Fuxi, Leibniz, Wheeler:
I read your description with pleasure, and since I am plagued by similar questions, your open points struck me especially: (1) The claim that the world is ultimately binary cannot be justified by Fuxi's trigrams, can it? And how? From the fact that a binary system was already known in early times (the -900Y you mention strikes me as very ambitious), one surely cannot conclude that the physical nature of the world is binary; to my knowledge it also remains open whether it was really understood at that time as logic or even as a number system (for the purely formal binary reading, see the small sketch after point (3) below).
(2) As far as I know, the 8 trigrams in Chinese symbolism together with the Tai-ki form the figure of the Ba-gua, which makes the symbolism somewhat more multilayered, since the Tai-ki is itself again binary. Note: entry [11] is missing from your reference list.
(3) Returning to modern physics: what is the rationale for the hypothesis of a world that is binary at its core, which is how I understood your assumption? From my point of view, Wheeler's leaps of thought, but also Landauer's principle, do not necessarily lead to a proof of a binary world.
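For concreteness, the purely formal binary reading mentioned under (1) is easy to state; the following is only a minimal sketch, and the conventions are my own assumptions (solid line = 1, broken line = 0, lines listed bottom to top, with the bottom line as the most significant bit). Under this particular choice the Fuxi "Earlier Heaven" sequence Qian through Kun simply counts down from 7 to 0, which is essentially Leibniz's formal observation; it proves nothing about how the trigrams were understood at the time.

# Minimal sketch: the eight trigrams (Ba-gua) as 3-bit integers.
# Assumed convention: solid = 1, broken = 0, lines given bottom-to-top,
# bottom line = most significant bit. Under this choice the Fuxi
# ("Earlier Heaven") sequence Qian..Kun enumerates 7 down to 0.
TRIGRAMS = [
    ("\u2630 Qian (heaven)",  (1, 1, 1)),
    ("\u2631 Dui (lake)",     (1, 1, 0)),
    ("\u2632 Li (fire)",      (1, 0, 1)),
    ("\u2633 Zhen (thunder)", (1, 0, 0)),
    ("\u2634 Xun (wind)",     (0, 1, 1)),
    ("\u2635 Kan (water)",    (0, 1, 0)),
    ("\u2636 Gen (mountain)", (0, 0, 1)),
    ("\u2637 Kun (earth)",    (0, 0, 0)),
]
for name, (bottom, middle, top) in TRIGRAMS:
    value = 4 * bottom + 2 * middle + 1 * top
    print(f"{name}: {bottom}{middle}{top} -> {value}")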
About the binary statistical inference
Unfortunately, I am not very knowledgeable in this area and must put off until later finding the time and leisure to understand it properly. (1) Surely the connection to the trigrams is merely a posited analogy to the harmonic- and arithmetic-mean formulation, or is there a substantive connection?
(2) The step from obtaining an estimate of a mean through statistical methodology to inferring a holistic structure from it is one I have not been able to follow. What does holistic mean here? How can an emergent leap from micro to macro be explained? From your point of view, is "holistic" equivalent to an estimation of mean values, or have I misunderstood?