Brian - thank you for your kind comments. After working in the dark depths of concurrency issues in computer science for most of my life, it seemed obvious to me that nature could just as easily "multiplex" its multiple universes on the same physical "hardware" of entangled matter in one universe.

Thank you for your hopes for a good review. I plan to publish the essay formally.

Kind regards, Paul

Dear Angel Garces Doz - thank you for your comments. You clearly understand the implications of my subtime postulate: "The photon is the carrier of time, and the Universe is a network automaton".

I found your essay interesting and set it on the special pile to be read again in depth. I did, however, have a hard time wrapping my head around "imaginary mass states" and "vibrations in the fabric of space-time, at speeds exceeding that of light".

I do not subscribe to the idea of "information content independent of the observer". I prefer the view that observers are part of the same network, as described in the excellent essay by Kevin Knuth in this contest.

My view is that causality is symmetric. There is no privileged role or direction for the observer-observee relationship. For every action there is an equal and opposite reaction. Just as effects must have causes for them to exist, causes must also have effects for them to exist. Measurements of information will thus be different (and opposite in sign) for each observer from their vantage point.

Kind regards, Paul

Georg - thank you for your comment. I have reviewed and rated all 180 essays in this contest.

Kind regards, Paul

Jim - you are welcome. I reviewed and rated all 180+ essays in this contest.

Yes, this idea is out of the box; this is why I described it as absurd. I spent years trying to find a hole in my argument, and decided it would be easier to publish it and get shot down in flames if I am wrong.

Subtime is the vector of energy/information that travels with the photon. I describe entanglement as the constant passing back and forth of this energy/information, resulting in no net change for an even number of traversals, while the net change from a single traversal is indiscernible from that of n+1 traversals.
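To make the parity point concrete, here is a toy sketch in Python (my own illustration; the function name and the ±1 bookkeeping are assumptions, not part of the essay's formalism): each traversal contributes an alternating +1/-1 subtime increment, so only the parity of the traversal count survives.

```python
# Toy model: alternating +1/-1 subtime increments for n photon traversals
# between two entangled atoms (illustrative bookkeeping only).
def net_subtime(n_traversals: int) -> int:
    """Net subtime after n alternating traversals: 0 if n is even, 1 if n is odd."""
    return sum(+1 if k % 2 == 0 else -1 for k in range(n_traversals))

for n in (0, 1, 2, 3, 8, 9):
    print(n, net_subtime(n))  # even n -> 0; any odd n -> 1, so 1 and n+1 traversals look alike
```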

This is "dark" because this photon energy/information is "trapped" until something else (a 3rd party) breaks entanglement by making a measurement on one of the atoms taking energy out of the system.

All photons travel at the speed of light, whether or not they are entangled. The difficulty lies in our inability to measure time intervals against a background of time, because such a background does not exist and therefore cannot be measured. As far as the results of measurements are concerned, the following paragraph from my essay pretty much sums up why we appear to see evidence of superluminal propagation in the experimental record:

Since time does not move forward until the arrival of a photon, entanglement can occur over arbitrarily large distances. There is no limit. The only constraint is in our imagination: it is difficult for us, as humans at the macroscopic scale, to imagine that we are living as if the flashes of the quantum stroboscope were smoothly joined together. They are not: there are brief flashes of reality during decoherence events, with long periods of darkness in between.

Kind regards, Paul

Janko - thank you for your comment.

Occam's razor is the most important principle I live by. It seems to me that subtime is far simpler than any other interpretation so far, so I don't understand your comment. Maybe you mean "for Occam" instead of "against"?

If anything, subtime is too simple, simplistic even. However, I wanted to get the basic idea on the table for debate first before discussing "optimizations".

Time is indeed symmetric inside coherent (bipartite) entanglements.

Thank you for the complimentary comment on my editing. I am not at a university, and I do not receive any form of compensation for this work. I have only myself as an editor, although my assistant has been known to find mistakes in my spelling and grammar.

I look forward to more comments when you have read it again.

Kind regards, Paul

Jim - thank you for your message. I followed the instructions on the help page. However, after a fair number of attempts, I was unable to create a link that worked properly when I clicked on it. This is why I reverted to text URLs in my postings.

I didn't have time to go into this any further, so I reported the problem to the FQXi administrators. I am waiting for them to get back to me and point out what I was doing wrong.

Kind regards, Paul

Ralph - thank you for your comment. I tried to write the paper in a style that could be accessible to those who are not professional physicists. I am interested in comments from everyone. You don't have to be a specialist in the field in order to have interesting questions.

Kind regards, Paul

Jonathan - you are welcome. Thank you.

Kind regards, Paul

Amos - thank you for your kind comments. I'm glad you found the idea divinely intriguing.

There are deep issues in the fundamentals of physics regarding what a clock really is. Even Einstein's photon clock considered time only within the boundaries of the clock. The concept of subtime exists only along the photon path between the atoms, not in the empty space beyond both ends. The rate of the ticks is arbitrary, depending only on the distance we choose between the atoms. Of course, nature may choose its own minimum distances according to the Pauli principle, which is what makes an atomic clock such an interesting device.
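As a concrete illustration (my own toy formula and notation, not a claim from the essay): for a two-atom photon clock with the atoms separated by a distance $d$, one round trip of the photon takes

$$t_{\text{tick}} = \frac{2d}{c},$$

so the tick rate is fixed entirely by the distance we choose between the atoms.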

To answer your question: the amount of subtime (ts) oscillates in sympathy with the bouncing back and forth of photons. Classical time (Tc) appears the same (frozen) to an outside observer. This is indistinguishable from being "dark", i.e., not observable. In this interpretation of entanglement, photons do interact with the screen, but instead of being absorbed (detected) they are reflected back to the source an arbitrary number of times.

I also have a suspicion that this is what Einstein was thinking. However, he was waylaid by Minkowski who insisted that time is built into the fabric of space in his famous 4D-spacetime interpretation of SR. I think this was a huge error.

Kind regards, Paul

Best of luck in the finals Paul.

I'm not sure if my rating was counted, so near the final bell, but you made it into the finals either way.

Have Fun!

Jonathan

Jim - thank you for your kind comments and wishes for good luck. I read Cramer's original 1986 paper a long time ago. The overview you provided in the above link is great. Thank you.

Cramer's Transactional Interpretation (TI) uses both a retarded and an advanced wave together as an outgoing "offer wave", and then another (180°-phase-shifted) retarded-plus-advanced wave as a "confirmation wave". Cramer's concept of an interaction is a "transaction": a one-off completion of a "contract" between an emitter and an absorber which localizes only one of a "pair" of conjugate variables from the offer wave. The offer wave and confirmation wave are sequentially ordered on a classical background of time (Tc). Cramer talks about past and future and issues of "retrocausality", committing essentially the same "background of time" fallacy discussed above with respect to Feynman and Penrose.

Cramer stresses the point that the interpretation of a mathematical formalism cannot be tested experimentally and must be judged on other grounds. TI therefore has no experimentally verifiable distinctions from other interpretations. Subtime, on the other hand, does have distinctions that can be tested experimentally.

The subtime interpretation (SI) identifies forward and backward photons (not waves) as "time incrementing" and "time decrementing" tokens of information, depending on the perspective of the sender or receiver. Unlike Cramer's interpretation, there is no external background of time on which to express these events in sequential order. Retrocausality is a non-issue for SI because it fully incorporates the reversal of time in bipartite interactions. This is why we can account for violations of Bell inequalities (in the "time averaged" experimental record) without sacrificing locality.

In SI the wave nature of a photon is expressed through the helicity of its traversal (clockwise when viewed from the sender, anticlockwise when viewed from the receiver). These helicities are reversed as the roles of sender and receiver are perpetually reversed in the hot potato of entanglement.

In a nutshell: the photon travels one way, then travels back, removing all evidence that it ever traveled there in the first place. The wave nature is expressed in the helical path of the photon (Poynting's rotating shaft) without having to assume a "wave function" that pervades all of space.

Subtime entanglement conserves energy and information in a perpetually alternating "hot potato" protocol. This manifests as an indefinitely "frozen moment" in classical time (Tc). The reality that we assume and perceive in Tc is really a quantum stroboscope: brief flashes of reality with long periods of darkness in between.

Kind regards, Paul

Jonathan - thank you. I would still love to hear your comments after reviewing the paper.

Kind regards, Paul


Dear Paul

Thank you for visiting my page. I enjoyed reading your well-thought-out, well-written, and well-illustrated paper on the it-bit conundrum. That the photon is the carrier of information per se goes to the heart of the question. I see you teetering on accepting that the Universe can be represented as one state that goes one tick at a time - i.e. that there is no time dimension - but then you draw back and create an ingenious system to have your cake and eat it too: as Einstein conceived it, and as a latter-day Mach might have it, all the elements of the Universe interacting simultaneously (if I understood your network concept).

I could not say I completely followed the logic of your Tc and ts, and I am baffled by the infinitely ricocheting photon between two atoms. What happens in a vacuum where no atoms exist? Anyway, from the degree of confidence in your essay I feel that you have found yet another ingenious way to formulate what happens in Reality, and I wish you all the luck in proving it and finding acceptance for it.

My approach, however, is very different and seeks simplicity. Perhaps you can cast a look at my 2005 Beautiful Universe Theory, also found here, in which the Universe is a timeless lattice of ordered nodes transmitting angular momentum locally, causally, and linearly. I have demonstrated how probability emerges from that exquisite order. I do not accept Einstein's point photon concept and was very happy when I discovered that Eric Reiter has independently proven that experimentally. Eric Reiter's website is unquntum.net. I think that is a very important finding that deserves much more discussion than it is garnering.

With all best wishes

Vladimir

    Hoang - thank you for your comment.

    Kind regards, Paul

    In response to your post: sorry for the delay, I have been a bit busy.

    I am glad that you read my essay, and I am even more happy that you rated it low.

    I read, and rated, all the essays some weeks ago, and if I remember well I rated your essay high (not a 10): I can say so now, because there is no problem now.

    I think, like you and other authors, that information is transmitted by photons, which carry the information of a system: for me, information is a measurement made using photons (or other gauge bosons).

    I think it is right that decoherence results from interaction between entangled bosons and other entangled bosons (or gravitational ripples, as I read somewhere).

    The use of a single photon to measure entropy change is interesting, but I have a problem with macroscopic objects: if I use a source of photons to read the macroscopic state of a system (for example, an image of a picture), does the system of optics and picture then have a constant entropy? If I understand correctly, there are entangled photons for each optical system, which decohere instantly.

    My old idea is that time changes because of radioactive decay (an irreversible process), or spontaneous emission (but my mind is devoid of prejudice).

    I think, like you, that time does not exist without curvature (in the past or future), or equivalently if there are no bosons, or no motions.

    A good essay.

      Dear Paul,

      Congratulations on the splendid essay. I am sorry I did not manage to rate it in time. You may have a look at: http://vixra.org/pdf/1306.0226v2.pdf

      Good luck with the experts' rating as well,

      IH


        Paul,

        Thank you kindly for directing my attention to your essay; I only wish you had done so prior to the rating deadline. I found the idea(s) expressed in your paper rather novel and interesting, although I did not find them absurd. While reading your paper I couldn't help but think of Louis Kauffman's Virtual Logic, where virtual is defined as "exists in essence but not in fact."

        While I'm a big fan of Dr. Kauffman, I'm not such a big fan of the Aspect experiments. I believe Peter Jackson, in his current FQXi essay, explores rather well the ambiguities inherent in the Aspect results. I accept that non-locality is an aspect of nature largely due to the Aspect results IN CONJUNCTION WITH the Mach-Zehnder results of Herzog et al. I don't know why the Mach-Zehnder results don't garner more attention, in that they are much more amenable to statistical analysis (as opposed to the Aspect experiments) and, in the context of the Bell inequalities, they hold the same amount of sway. So, while I understand how your idea(s) deal with the Aspect results, I do not fully comprehend how they explain the Mach-Zehnder results as well; perhaps you could address this lack of understanding for me.

        Now, to address your perception of my own essay. In my world view consciousness isn't emergent; rather, it's primal. Consciousness, like magnetism, permeates, and an entity's consciousness permeability is a function of its structural complexity (as defined by Ben Goertzel in his Pattern Theoretics). This position is largely a result of my long-running yoga and meditation practice, but it makes logical sense as well.

        David Deutsch, a very interesting thinker in my opinion, observes that information can be transformed to suit a wide array of media and then asks, "What is the general form of information?" Dr. Goertzel, in my opinion once again, gives a compelling argument, with his Pattern Theoretics, that the general form of information is pattern; whether words in a book, bars in a barcode, or bit-strings in a sequence function, the relevant information manifests as pattern. In his book FROM COMPLEXITY TO CREATIVITY Dr. Goertzel conjectures that reality (IT) emerges from pattern dynamics; in fact, he conjectures that reality is, at a fundamental level, an evolving ecology of pattern. He models such an ecology as interacting systems of functions on Non-Well-Founded Sets. He calls it a magician system, with his aptly named magicians and anti-magicians running around casting spells on one another. Their spell casting leads to "structural conspiracies", which are simply patterns (structure) conspiring to maintain one another. These pattern dynamics lead to the attraction, autopoiesis, and adaptation of complexity science. What Dr. Goertzel doesn't, in my opinion once again, adequately address is, "Where do the primal patterns come from?" I suggest they come from primal consciousness.

        As essay author Joaquin Szangolies has pointed out, the problem with dual-nature theories is, "What is the causal nexus between the two natures, between the mental and the physical, the Platonic and the phenomenal?" In many of his White Papers William Tiller, a very distinguished Stanford physicist, concedes that there is no broad consensus as to the definition of consciousness, but then points out that we all agree that consciousness manipulates information - pattern. So this leads me to my definition of consciousness: consciousness is the causal nexus between the mental and physical, the Platonic and phenomenal, with said causal ability manifesting as a result of its ability to manipulate (i.e. create, annihilate, transform) information - pattern.

        So to conclude, the gist of my essay is that while, inferentially, the correspondence principle suggests IT from QUBIT, what makes IT interesting is emergence and in order for science to properly understand emergence it would seem necessary to sacrifice the, in my opinion once again, erroneous fundamental assumption that matter and the four known forces are primary. In fact, in my world view the four forces unify under the informative tag "conscious intent." Of course this is controversial . . .

        Thank you once again for directing my attention to your essay and best regards,

        Wes Hansen

          Wes - Thank you for your comments. I enjoyed reading Peter Jackson's essay and commented as such on his page.

          I very much enjoyed the virtual logic paper you referenced by Kauffman. The unity in multiplicity is at the heart of many paradoxes. "It is ONE for the global observer and MANY for the local observer."

          Our mathematical formalisms in QM and GR tend to all be global "God's eye views" (GEVs), which mislead us as to what we can know and predict about the universe. The "local observer view" (LOV), however, can have many unities.

          For example, Maxwell's equations have time symmetric solutions: the retarded exp(-iwt) and advanced exp(+iwt) waves. Wheeler & Feynman explored this in their 1945 absorber paper referenced in my essay. Despite their mathematics coming out correctly by using half-retarded (from the past) and half-advanced (from the future) as the field generated by each charge, they still used a GEV 4-vector (Minkowski) background of time.

          The theory of virtual particles that came out of Feynman's work was intended to eliminate the notion of a field. He laid the groundwork for what we now see as an obvious next step: ditch the idea of a monotonic and irreversible GEV background and replace it with a LOV perspective where photon traversals represent local increments and decrements in time. To relate the "LOV" subtime to what "we" view as "GEV" classical time: simply use the triangle inequality to sum all the "absolute" values of each photon traversal, and voila, we can now see the obvious relationship to the mathematics of quantum theory and entanglement.
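          In symbols (my own notation, offered only as a sketch of the triangle-inequality step, where $\Delta t_k$ denotes the signed subtime of the $k$-th photon traversal):

          $$T_c \;=\; \sum_k |\Delta t_k| \;\ge\; \Big|\sum_k \Delta t_k\Big| \;=\; |t_s^{\,\mathrm{net}}|,$$

          with equality only when no traversal is reversed.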

          Within the subtime context we associate the departure of a photon from the transmitter atom with exp(-iwt) and the arrival of that same photon at the receiver atom as exp(+iwt). If entanglement does indeed turn out to be a photon hot-potato protocol as I postulated, then of course the net result is zero: energy and information are conserved in the entangled pair.
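          Written out in the same notation, one outbound leg and one return leg compose as

          $$e^{-i\omega t}\, e^{+i\omega t} = 1,$$

          so the phases cancel and nothing accumulates in the classical record, which is just the conservation statement above.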

          I will take a closer look at the Mach-Zehnder results, and examine whether the concept of subtime provides as much insight there as it does for entanglement.

          I will also look at the other references you described. I'm not sure I am qualified to discuss autopoiesis or consciousness. It seems too far up the mesoscopic and macroscopic chain for a simple/direct analysis relative to subtime.

          I feel even less qualified to post an opinion on the fundamental nature of the four standard model forces. It might be that subtime can be thought about in a similar way in all boson/fermion interactions.

          Thank you.

          Kind regards, Paul

          Dear IH

          Thank you for your comment. I have the paper you referenced printed and will read it this weekend.

          Kind regards, Paul

          Tom - thank you for your response. I concur. If there is no background of time, then any one-way measurement of traversal time between two different systems tells us nothing about the instantaneous speed of a single body. I liked your Kepler's orbitals example. Your point is well taken.

          By the very definition of subtime, you cannot measure it without a two-way traversal. The photon has to bounce back and forth at least once, otherwise time (and hence any velocity measurement) is undefined. Even Einstein defined time as "the reading on a suitably-synchronized clock located at the same position as the event". Synchronization is a two-way event.
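          For reference, Einstein's synchronization convention makes the two-way requirement explicit: a signal leaves A at $t_A$, is reflected at B at $t_B$, and returns to A at $t'_A$; the clocks are synchronized when

          $$t_B = t_A + \tfrac{1}{2}\,(t'_A - t_A),$$

          a definition that only exists once the full round trip has occurred.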

          I don't readily grasp your connection between punctuated equilibrium and the quantum stroboscope, and wonder if I did not express myself well in the essay on this point.

          I am familiar with Per Bak's work on SOC and Bar-Yam's work on complex networks. This is a good analogy to help us understand the sudden changes at the atomic level, while nothing appears to change at the macroscopic level.

          However, the point of the quantum stroboscope is that "for each atom" there is no experience of the passage of time. The modular arithmetic of subtime recurrences is, for all intents and purposes, "eternal". The experience of time is therefore the chain of interactions which add or subtract information from the system.

          Kind regards, Paul