I would like to thank Mehran Shaghaghi for initiating a discussion on their recent paper, “The Physical Foundations of Quantum Theory”, published in February 2023 on Springer Link: https://link.springer.com/article/10.1007/s10701-023-00673-2.

Abstract:

The number of independent messages a physical system can carry is limited by the number of its adjustable properties. In particular, systems with only one adjustable property cannot carry more than a single message at a time. We demonstrate that this is true for the photons in the double-slit experiment, and that this is what leads to the fundamental limit on measuring the complementary aspect of the photons. Next, we illustrate that systems with a single adjustable property exhibit other quantum behaviors, such as noncommutativity and no-cloning. Finally, we formulate a mathematical theory to describe the dynamics of such systems and derive the standard Hilbert space formalism of quantum mechanics as well as the Born probability rule. Our derivation demonstrates the physical foundation of quantum theory.

5 days later

The paper presents a fascinating perspective and is likely to make a significant contribution to the field. It has successfully resolved several complex issues. This makes me wonder: if quantum systems are just a special case of single-message systems, do we really need a theory of everything?

    Robert McEachern

    Replace one member of an entangled pair with an exact, pixel-by-pixel, identical (negative) copy of the other member, instead of retaining the original pair, whose members are only "statistically" identical rather than "exactly" identical. That simple substitution changes everything...

    That simple change in the nature of the inputs ("identical twins" versus "fraternal twins") entirely changes the nature of the observed correlations between the entangled pairs.

    Robert McEachern
    I suspect those are the same concept. In his definition, a single-message system consists of a single piece of information, which can be any sequence of bits of data, not just one bit (↑ / ↓) of data.

      Michael No, they are quite different in a very important way. It has nothing to do with the number of bits of data in a message; the only thing that matters is the amount of information. The Time-Bandwidth product in Shannon's Capacity expression for the amount of information (corresponding to the product in the Heisenberg Uncertainty Principle [HUP]) limits the number of independent (uncorrelated) measurements that can be made on a system. But the other term in Shannon's Capacity (the signal-to-noise ratio), which has no equivalent in the HUP, limits the number of significant bits in each measurement. Hence, when both the number of uncorrelated measurements and the number of bits per measurement become extremely limited, strange things happen. But it is important to realize that a single bit of information can be encoded in a variety of ways: by increasing the time-bandwidth product (and thus the number of uncorrelated measurements that can be made) while simultaneously reducing the signal-to-noise ratio, or vice versa.
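
      To make that tradeoff concrete, here is a minimal Python sketch (my own, not anything from the paper), assuming the standard Shannon-Hartley form C = TW · log2(1 + SNR), with TW standing in for the number of independent (uncorrelated) samples:

      ```python
      # Toy illustration: one bit of information can be carried by one clean sample
      # (TW = 1, SNR = 1) or spread across many noisy samples (large TW, low SNR).
      import math

      def capacity_bits(time_bandwidth, snr):
          """Total information in bits: time-bandwidth product times log2(1 + SNR)."""
          return time_bandwidth * math.log2(1.0 + snr)

      def snr_for_one_bit(time_bandwidth):
          """SNR needed so that the whole observation window carries exactly one bit."""
          return 2.0 ** (1.0 / time_bandwidth) - 1.0

      if __name__ == "__main__":
          for tw in (1, 2, 10, 100):
              snr = snr_for_one_bit(tw)
              print(f"TW = {tw:4d} -> SNR = {snr:.4f} -> C = {capacity_bits(tw, snr):.3f} bit")
      ```

      Running it shows the same single bit being bought either with one high-SNR measurement or with many low-SNR ones, which is exactly the encoding freedom described above.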

      2 years later

      Hello, I'm enjoying the conversation so far. If I might, I wonder whether the “single-message” framing is missing one subtlety. It seems that the decisive factor is not just the count of adjustable properties, but the effective rank of independent modes once coupling, noise, and feedback are taken into account.

      Formally, you could think of a system’s information capacity as

      C_eff ∼ rank(M) · log2(1 + SNR),

      where rank(M) is the number of controllable orthogonal modes that survive environmental interaction, and SNR sets the number of usable bits per mode.
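
      As a minimal numerical sketch of that heuristic (the coupling matrix M, the SVD noise floor, and all the numbers below are my own illustrative assumptions, not anything from the paper):

      ```python
      import numpy as np

      def effective_rank(M, noise_floor):
          """Count the singular values of the mode-coupling matrix M above the noise floor."""
          singular_values = np.linalg.svd(M, compute_uv=False)
          return int(np.sum(singular_values > noise_floor))

      def effective_capacity(M, snr, noise_floor):
          """C_eff ~ rank(M) * log2(1 + SNR), with the rank taken after environmental noise."""
          return effective_rank(M, noise_floor) * np.log2(1.0 + snr)

      if __name__ == "__main__":
          rng = np.random.default_rng(0)
          # Nominally four adjustable parameters, but the coupling is nearly rank-1,
          # so only one orthogonal mode survives a realistic noise floor.
          M = np.outer(rng.normal(size=4), rng.normal(size=4)) + 0.01 * rng.normal(size=(4, 4))
          for floor in (0.001, 0.1, 1.0):
              print(f"noise floor {floor:5.3f}: rank = {effective_rank(M, floor)}, "
                    f"C_eff = {effective_capacity(M, snr=3.0, noise_floor=floor):.2f} bits")
      ```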

      Quantum behavior would then arise when the effective rank collapses to one under physical constraints, even if the system has more adjustable parameters in principle. That collapse produces the familiar features:

      Complementarity → mode capacity allocated to one observable suppresses orthogonality in the conjugate.

      Noncommutativity → sequential probes alter which effective mode survives, because the first probe changes the rank structure (a toy numerical sketch follows this list).

      No-cloning → any attempt to amplify the single surviving mode injects entropy that destroys coherence.
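
      To make the noncommutativity point concrete, here is the toy numerical sketch mentioned above (mine, not the paper's formalism): two rank-1 projections onto different "surviving modes", applied in opposite orders, give different results.

      ```python
      import numpy as np

      def projector(vec):
          """Rank-1 projector |v><v| onto the normalized vector v."""
          v = vec / np.linalg.norm(vec)
          return np.outer(v, v.conj())

      P_z = projector(np.array([1.0, 0.0]))                 # probe along the "z mode"
      P_x = projector(np.array([1.0, 1.0]) / np.sqrt(2.0))  # probe along the "x mode"

      psi = np.array([0.6, 0.8])  # some initial single-message state

      print("P_z P_x psi =", P_z @ P_x @ psi)   # [0.7, 0.0]
      print("P_x P_z psi =", P_x @ P_z @ psi)   # [0.3, 0.3]
      print("commutator norm:", np.linalg.norm(P_z @ P_x - P_x @ P_z))  # ~0.707, nonzero
      ```

      The order of the probes matters precisely because the first projection decides which mode is left for the second one to see.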

      From this perspective, the Heisenberg uncertainty principle captures only the time–bandwidth side (number of independent samples), while Shannon’s formula reminds us that SNR is equally fundamental. Quantum “strangeness” then looks like the limit of small rank(M) and low SNR simultaneously.

      A technical question this raises: if one could experimentally engineer environments that keep rank(M) = 1 but tune the SNR independently (say, by adding controlled spectral noise), would the Born probabilities still hold exactly, or would deviations appear once the information rate falls below a critical threshold?
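
      In case it helps frame what a test could look like, here is a toy Monte Carlo sketch (purely illustrative; the Gaussian readout-noise model and the threshold discriminator are my own assumptions and say nothing about whether the Born weights themselves would shift):

      ```python
      import numpy as np

      def empirical_up_frequency(p_up, sigma, n_trials=200_000, seed=1):
          """Observed 'up' frequency when ideal Born sampling is read out through noisy detection."""
          rng = np.random.default_rng(seed)
          ideal = rng.random(n_trials) < p_up                               # ideal Born-rule outcomes
          readout = ideal.astype(float) + rng.normal(0.0, sigma, n_trials)  # noisy detector signal
          return np.mean(readout > 0.5)                                     # threshold discriminator

      if __name__ == "__main__":
          p_up = 0.64  # Born probability |<up|psi>|^2 for some fixed state
          for sigma in (0.0, 0.2, 0.5, 1.0):
              freq = empirical_up_frequency(p_up, sigma)
              print(f"sigma = {sigma:.1f}: empirical frequency = {freq:.3f} (Born value {p_up})")
      ```

      This only shows the trivial classical degradation of the observed frequencies toward 0.5 as the SNR drops; the interesting part of the question is whether the underlying Born weights would stay exact once rank(M) = 1 is enforced but the information rate falls below threshold.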