Essay Abstract

The impossibility of using quantum nonlocality for controllable signalling is widely accepted in the literature. However, a critical examination of the proof strategies used to establish this claim shows that they are circular: they depend upon locality assumptions that grant exactly what needs to be proven in order to establish no-signalling, or that were embedded ad hoc in the formalism of quantum theory precisely in order to block predictions of signalling. We conclude that forty-five years after the publication of Bell's Theorem, the question of signalling remains open.

Author Bio

Kent A. Peacock is Associate Professor of Philosophy at the University of Lethbridge, Alberta, Canada. He received his PhD from the University of Toronto in 1991. He has published on foundations of physics, metaphysics of time, and ecological philosophy, and is the author of a recent study of the history of modern physics entitled The Quantum Revolution: A Historical Perspective (Greenwood, 2008).


18 days later
  • [deleted]

Dear Kent,

Yours is the most interesting essay I have read so far on this site.

It seems to me that the signalling problem will lead us to the next step in physics. I am inclined to think that quantum mechanics in fact describes a digital communication network. Shannon's theory of communication tells us that we avoid error by quantizing messages, that is, by making them as far apart as possible in message space so that the probability of confusing them is negligible. This would seem to explain the fact of quantization.
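
To make the Shannon point concrete, here is a minimal Python sketch of my own (the signal separations and the unit-variance Gaussian noise are illustrative assumptions, not anything drawn from quantum theory). Two messages sent as levels separated by a distance d over an additive Gaussian noise channel are confused with probability Q(d/2σ), so spreading the levels apart drives the error rate toward zero:

import math

def confusion_probability(d, sigma=1.0):
    # Probability that Gaussian noise with standard deviation sigma pushes
    # one of two signal levels, separated by distance d, past the decision
    # midpoint between them: Q(d / (2*sigma)), computed via erfc.
    return 0.5 * math.erfc((d / (2.0 * sigma)) / math.sqrt(2.0))

for d in (0.5, 1.0, 2.0, 4.0, 8.0):
    print("separation %.1f -> confusion probability %.3e" % (d, confusion_probability(d)))

Doubling the separation cuts the confusion probability by orders of magnitude, which is the sense in which keeping messages far apart in message space defeats error.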

Shannon's theory also explains the delay inherent in error-defeating communication, since the packetizing system must wait for the source to emit a certain number of symbols before it can construct a packet. We might associate 'the universal minimum error-proofing delay' with the velocity of light, light speed representing the fastest communication algorithm in the universe.
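
Again, a rough sketch of my own (the block lengths and symbol period are made-up numbers for illustration): the encoder cannot release a packet until it has collected a full block of source symbols, so a block of k symbols produced one per period T imposes an irreducible delay of k*T at the sender, before any propagation delay is even counted:

def packetization_delay(k, symbol_period):
    # Time the sender must wait, from the first symbol, before a complete
    # k-symbol packet exists and can be handed to the channel.
    return k * symbol_period

for k in (1, 16, 256, 4096):
    delay_us = packetization_delay(k, symbol_period=1e-6) * 1e6
    print("block of %4d symbols at 1 us/symbol -> minimum delay %7.0f us" % (k, delay_us))

Longer blocks give better protection against error but cost more waiting, which is the trade-off I have in mind when I speak of a universal minimum error-proofing delay.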

In situations where error is not a problem, the error-proofing delay is unnecessary, and so we might get superluminal communication.

A computer communication network is a digital, logical system, and it seems to me that such a system has a property we might call logical continuity (like a proof or the execution of a Turing machine), which I feel is logically prior to the geometrical continuity that physicists struggle so hard to preserve.

Finally, the network approach might lead us to the idea that gravitation is not quantized: like an ideal power distribution network, it operates at zero entropy and transmits no information, so it has no possibility of error, and quantization or packetization for error prevention is unnecessary.

  • [deleted]

PS:

In a nutshell, we might map quantum measurement to digital communication and accept that logical continuity (the continuity of a digital communication network) is prior to geometrical continuity, so that the latter is a product of the former rather than vice versa.

Networks are made possible by communication protocols, which are algorithms for moving information from user to user. The foundation of every network is a physical layer. We might then see physics as the fundamental communication protocol, one in which messages are so simple that we can decode them by integrations, that is, by simply adding up the number of letters in a message string. We can look at a 'wave packet' as a string, a superposition of 'letters'. Higher software layers in the universal network add more complex algorithms to their communication protocols, all the way up to ourselves and beyond.

For my purposes, the digitization of the world vastly increases its entropy (and so the information content of concrete situations), so that I may reasonably propose that the universe is divine, establishing a precondition for theology to become an empirical science.

All the best,

Jeffrey
