Dear Armin,

I also believe that the distinction between actuality and possibility is needed to give a consistent interpretation of quantum mechanics. In my essay - following Carl Friedrich von Weizsäcker - the time structure is essential: the past is factual (something has happened or not), the future is possible. Mathematics is then the imagination of possible acts in the future, which would lead to a constructivist justification of mathematics. What role does time play in your theory?

I would be happy if you could read my essay and comment on it.

Many thanks

Luca

    Hi Akinbo,

    Before I respond to your comments, I have a question:

Above you gave an elementary textbook definition of wavelength as the distance between two successive waveforms. Am I correct in believing that you hold this definition to be exact? If not, why not?

    Armin

    Hi Armin,

    I must admit, learning a new form of logic is difficult, but I still want to ask you some questions even though I did not grasp all of your essay.

    Do you think in your model where you consider a quantum 2-D object which has the potential to become a 3-D object (instead of a 3-D object with one dimension consisting of every possible value) you could still account for interference patterns as seen in the double slit experiment?

    Your talk about our freedom to choose consistent axiom systems made me think about how certain propositions may not be provable in certain axiom systems, but may be provable in others. Have you seen Stephen Wolfram's work where he talks about enumerating different axiom systems and showing what theorems are provable under each? There's a lot of overlap in what theorems are provable in each consistent axiom system, but there are still some differences. How do you think this relates to your view that the "freedom to choose one's axioms coupled with the requirement of consistency should naturally lead us to expect mathematics to be unreasonably effective in modeling reality, but that this unreasonable effectiveness only exists, as it were, as an actualizability until human imagination transforms (parts of) mathematics into an actually effective model of reality."?

You talk of ZFC (and your ZFCD)... Do you think the axiom of choice is a reasonable axiom in a system that contains real numbers which are uncomputable? How could one actually choose one of these uncomputable reals, if they cannot be specified in a finite way? Unlike computable real numbers like pi and e, most real numbers cannot be referred to in an algorithmic way because they contain incompressible information, and therefore there is no finite way to refer to them. Maybe this is where your thoughts on "imagination" come in?

    Also, is ZFCD trying to be a meta-mathematical theory that tries to explore the ramifications of an incomplete system being consistently extended, from a general perspective?

Is what you describe as "pro-actually" a form of determinism, where what you define as "actualizably" is a more probabilistic view of the future? How would you view the question of whether a 10,000-digit number was prime or not from this perspective?

    You said:

    "Thus, the context can be interpreted as a sample space and the measure associated with the "collapse" of actualizabilities (i.e emergence of an actual outcome) in the absence of an e-spec is just what we call probability. This measure is different from non-probabilistic measures because it is over a set in the outer domain, and thereby captures the concept of probability."

    Is this related to the idea that something may be considered probabilistic until a proper fully-predictive theory is found? Is it related to the mathematical fact that we don't know if we are up against a true and unprovable statement or if we just haven't done enough searching to find a proof? Is something that is undecidable, probabilistic from this point of view? Are you familiar with Gregory Chaitin's omega constant?

    I think you might have some interest in some of the questions I posed at the end of my essay. Somehow many of them seem very relevant to your work. Please consider taking a crack at answering one of them.

    Thanks,

    Jon

Dear Jon,

Thank you for your kind remarks. I see that you have asked many excellent questions below. I will respond to them, but because my response will be a long one, I ask for a little time.

      Best,

      Armin

      Dear Luca,

      Thank you for your comments, I will leave some remarks on your site as well.

In answer to your questions, time plays multiple roles in my framework. This manifests itself, among other things, in the following:

1. There is a semantic duality between actualizability in the present world and actuality in possible future worlds. For that reason my logical operators could also be regarded as temporal modal operators.

2. The framework explains quantum phenomena through the absence of a characterization in terms of a (spacetime) coordinate time.

      3. The axiom takes us from the absence of this association to a superposition of two time directionalities, thereby connecting it both with the Born Rule and the fact that the differential equation that gives the time-evolution of quantum states (i.e. the Schroedinger equation) is time symmetric.
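(For reference, and as a standard fact independent of the framework sketched above: the sense in which the Schroedinger equation is time-symmetric is that, assuming a real Hamiltonian, time reversal combined with complex conjugation maps solutions to solutions.)

$$ i\hbar\,\frac{\partial \psi(x,t)}{\partial t} = \hat{H}\,\psi(x,t) \quad\Longrightarrow\quad \psi^{*}(x,-t)\ \text{also solves the equation whenever}\ \hat{H}^{*}=\hat{H}. $$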

There are other roles for time which I cannot get into without actually talking about the physics that this mathematical framework is meant to support, but in short, the concept of time is essential.

      Thank you and best wishes,

      Armin

      Dear Armin,

      You commented upon my paper so early in the process that I've been under the impression that I had read your essay and commented on it. I see that is not the case. As you've recently returned to offer valuable suggestions, I thank you for those and offer my views on your essay.

You begin by extolling the freedom to choose one's starting assumptions as a great virtue of mathematics, limited only by consistency. You then note that ZFC is generally regarded today as the foundation of mathematics, yet it is not known with absolute certainty that it is itself consistent.

Your own contribution, which you label tentative, is based on "free logic" which, as I understand it, deals with the context of potentiality 'surrounding' the inner logic of things that 'exist', with the key operators being 'Possibly', 'Actually', and 'Necessarily'. Based on these you formulate the Axiom of Default Specification, intended to extend ZFC to handle "incompleteness".

      Your first example is designed to illustrate the use of these terms leading to an "emergence specification" which

      "Collapses" the superposition of actualizability to permit the emergence of the actual element,"

      which of course is the relation to the "quantum mechanics" in the title of your essay, and you say

      "The measure associated with the "collapse" of actualizability ... is ... probability."

You then discuss contextuality in the context of the Peres-Mermin Magic Square, which [as I understand it] is a special case of Peres's Table 6-1 [page 167] of 'Actual and hypothetical outcomes of N quantum tests', in which he claims that none of the possible 'filled-in tables' obeys the cosine correlation. He then [page 197] proves the Kochen-Specker theorem using 33 vectors in R^3.
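(For readers without Peres's book at hand, the standard Mermin-Peres array of two-qubit Pauli observables, which I take to be the 'magic square' referred to here, is usually written as)

$$
\begin{array}{ccc}
\sigma_x \otimes I & I \otimes \sigma_x & \sigma_x \otimes \sigma_x \\
I \otimes \sigma_y & \sigma_y \otimes I & \sigma_y \otimes \sigma_y \\
\sigma_x \otimes \sigma_y & \sigma_y \otimes \sigma_x & \sigma_z \otimes \sigma_z
\end{array}
$$

(The three observables in each row and in each column commute; every row and the first two columns multiply to +I, while the third column multiplies to -I, which is what rules out a non-contextual assignment of definite ±1 values.)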

      You note "the quantum mechanical situation invoking commuting spin operators" involves e-specs. As I develop these operators in my endnotes [and in more detail in my reference [2]] it is clear that this idealization of spin as a 'qubit' or two-state system leads to non-intuitive results and much of the mysticism of QM is traceable to this idealization.

As QM is based on analysis of outputs, and a "good" experiment is generally one with discrete output states, the apparent fact that all measurements based on a constant magnetic field tend to satisfy this "two-state" requirement is, in some respects, a "self-licking ice cream cone".

      My essay, of course, asks what happens when this overly simple characterization of spin is loosened. You recall that "the quantum state is a superposition state" which is also fascinating, given Matt Leifer's contention (also quoted in my endnotes) that, in 2015, we don't know what a quantum state is, whether it is ontological or epistemological. Yet, in our math formulas, we are free to superimpose whatever they are! And then to talk about the "collapse" of the superposition.

      You state the current consensus, that "the quantum correlations between the particles go beyond what is possible in any classical arrangement, and this was proven in a theorem by Bell."

Of course my essay claims that Bell's "proof" is a mathematico-logical proof based on over-simplified physics. Even his basic model, before getting into the proof, is contradictory, as he assumes a constant field which yields zero, while he requires ±1 results. But you know this.

      In short, given the false conclusions (in my opinion) of the Bell theorem, you have performed an admirable analysis of the logic necessary to support or explain these false conclusions.

You comment on my thread that to seriously consider my argument, one must be willing to question the 90-year-old interpretation of Stern-Gerlach. Yes indeed. The current state of confusion in quantum mechanics is such that, if after almost a century the early ideas have not yet led to clarity [I don't consider Kochen-Specker's 33-ray proof in any way 'clear' with respect to physical reality], it may well be time to revisit both the experiments, with a century's newer technology, and the early ideas and concepts of spin -- essentially unchanged since Goudsmit and Uhlenbeck.

Finally, as you end your essay with a discussion of the Feynman path integral, the basis of which is the progression from state n-1 to state n [see your theorem, following equation 6 on page 8], I would call your attention to the second diagram on page 10 of my essay, which links the generalized automata representation I develop in The Automatic Theory of Physics to a Feynman quantum field theory kernel, and shows the equivalence of the "next-state-address" in the automata to "potential" in standard physics. This is a novel identification that you may or may not find interesting.

      My very best regards,

      Edwin Eugene Klingman

      Hello Armin,

      I smell a 'dialectic' bait meant to entrap me and I am in a dilemma whether to swallow the bait or not :)

      I was therefore forced to Google "wavelength" and came up with these:

      *Wavelength is the distance between identical points in the adjacent cycles of a waveform signal propagated in space or along a wire.

*In physics, the wavelength of a sinusoidal wave is the spatial period of the wave -- the distance over which the wave's shape repeats -- and the inverse of the spatial frequency.

Asking whether I hold this definition to be exact or not will certainly take us into a debate over whether light is a wave or a particle or both. It would also take us into whether a medium of propagation is present or not. While I am not shying away from debating those contentious territories, it would becloud the fundamental basis for the validity of SR, where we started this discussion. To repeat:

      Einstein himself says, "But ALL experiments (without exception) have shown that... optical phenomena, relatively to the earth as the body of reference, ARE NOT influenced by the translational velocity of the earth...". Do you agree?

      Now, if your only condition to answer the question is my response, then I confirm that to almost all intents and purposes I hold the definition to be exact!

      Regards,

      Akinbo

*In addition to the links on this thread, I thought I had linked Herbert Dingle's book, Science at the Crossroads, but I obviously must have done so elsewhere. Read these links critically and you will understand my excitement about the photon existence paradox and the lack of interest by the establishment in your paradox.

      Hi Akinbo,

Thanks for your reply; the other definitions of wavelength you gave above will do just as well. Now, just one more question: do you accept the non-commutation relations of quantum mechanics (or, equivalently, the Heisenberg uncertainty principle)? Note that if you don't, then that means that you reject QM.
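(For reference, and not as a characterization of either party's position: the non-commutation relation usually cited, here for position and momentum, together with the uncertainty bound that follows from it, is)

$$ [\hat{x},\hat{p}] = \hat{x}\hat{p}-\hat{p}\hat{x} = i\hbar, \qquad \Delta x\,\Delta p \geq \frac{\hbar}{2}. $$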

      Best,

      Armin

      Armin,

      You are a veritable and formidable opponent. Perhaps, you should have been invited to Copenhagen when Bohr and Einstein were debating...

I cannot swallow the non-commutation relations of QM, to the extent that QM is valid ONLY IF special relativity is valid. So yes, I reject the version of QM that is founded on SR. The experimental successes of QM must therefore have alternative explanations. There can be more than one explanation for an experimental result, with one of them being the eventually correct answer.

Read Pentcho's latest post on Faster than Light concerning what the postulate of SR that we are finding contentious means.

      Regards,

      Akinbo

      Dear Akinbo,

      Thank you for your response. If I understand you correctly, you say that you reject any parts of quantum mechanics that are founded on special relativity, but you left it open whether you accept those parts that are not founded on SR. I can assure you that on the currently accepted view the non-commutation relations (or equivalently, the Heisenberg uncertainty principle) have absolutely nothing to do with SR. So, do you accept them or not? (I remind you that these lie at the heart of QM, so if you reject them, you are not just rejecting parts of QM, but all of it.)

      Best,

      Armin

      Dear Gordon,

      Thank you for your explanation. I think you have misinterpreted Bohr's statement. At any rate, that doesn't matter, because what is relevant is whether your view and mine agree, and unfortunately, it seems that there are some major differences. It already begins with your statement: "if I send you a randomly polarised particle..."

      I thought from my essay it would have been obvious that I am pursuing the interpretation according to which there is no particle before the measurement event. And if there is no particle, there is no particle property (which I believe is what Bohr's statement amounts to). There is nothing that is polarised, whether randomly or not.

From my point of view, the particle "comes into existence" as an actual spacetime object upon the interaction with the apparatus that we call a "measurement". The state of the actual spacetime system during the measurement (particle plus apparatus) is determined by the relative contributions of all the actualizabilities associated with the underlying object described by the incomplete spacetime vector, and these contributions map the actualizabilities to the concept of a probability amplitude.

So, to see whether our views agree, you need only answer this question: In these correlation experiments, are there, according to your theory, particles between measurements or not? My impression is that there are, and this would be consistent with the fact that you call your theory "realist". However, it would be inconsistent with what I am working on.

      Best wishes,

      Armin

      Dear Edwin,

      Thank you for your comments, let me just give a few clarifications on your summary of my ideas.

1. My framework is based on a combination of free logic and modal logic; however, I am looking into ways of simplifying it so that possibly only one kind of extension of classical logic is needed.

2. Your comparison between the states in 2-D Hilbert Space and the "self-licking ice cream cone" is funny, but I think it understates the fact that it is already capable of modeling classically highly unexpected behaviors, and yes, I am thinking of the sequential SG experiments with B-fields aligned along different axes. I know that you deny that the actual experiment is modeled by it, but this does not take anything away from the fact that, as a model, it makes non-trivial non-classical predictions and serves as an excellent way of understanding higher-dimensional Hilbert Spaces.

      3. I think Matt Leifer's claim about the quantum state was not about wondering whether it is a superposition state or not, but about wondering about its ontological status.

4. The theorem that connects my ideas to the path integral was the centerpiece of the essay. In my opinion, there are nowadays so many reformulations of quantum mechanics that it has become a veritable cottage industry, and the problem which, as far as I know, most if not all of them seem to have in common is that they don't make it any clearer what is "really" going on. I like to think that my model does. Although I don't know anything about your automatic theory of physics, if it is based on automata, then the first question I would have is what objects in the real world these automata correspond to. Without this information, it would seem like just another "black box" reformulation.

      Thank you again for taking the time to write your comments.

      Best wishes,

      Armin


      Dear Jon,

      Thank you for your comments and for your questions. I will try to answer them as best as I can. Keep in mind that I have not yet completely worked out everything, but I like to think that I am close.

      "Do you think in your model where you consider a quantum 2-D object which has the potential to become a 3-D object (instead of a 3-D object with one dimension consisting of every possible value) you could still account for interference patterns as seen in the double slit experiment?"

The mathematical model alone cannot do this because it doesn't explain 1) where the phase factor comes from and 2) how the double-slit potential is to be modeled.

However, if you combine the mathematical model with the physical model I presented at my talk in Vaxjo, then I believe it is in principle possible, because the physical model supplies a basis from which one can get both: the phase factor comes out of a postulated mechanism that compares the passage of time in areatime to the passage of time in spacetime, and the potential corresponds to where you set your point of origin in the abstract plane formed out of the two proper times. I say "in principle" because I have not yet performed the calculation, as I have still been focused on much more basic issues in my framework.

      "Have you seen Stephen Wolfram's work where he talks about enumerating different axiom systems and showing what theorems are provable under each?"

No, I was unaware of Wolfram's work in this area. I just read the linked talk and found it very interesting. Essentially, it appears he has taken this idea of exploiting the freedom of choosing axioms several steps further. I think that this work has the potential to be very useful in mathematics, but because the space of possible axioms is infinite, you will still need something like imagination to pick out the most useful ones. Reading his talk made me wonder whether it will ever be possible to simulate imagination in machines. I think that would be both a profoundly exhilarating and terrifying prospect. HAL had imagination.

      "There's a lot of overlap in what theorems are provable in each consistent axiom system, but there are still some differences. How do you think this relates to your view that the "freedom to choose one's axioms coupled with the requirement of consistency should naturally lead us to expect mathematics to be unreasonably effective in modeling reality, but that this unreasonable effectiveness only exists, as it were, as an actualizability until human imagination transforms (parts of) mathematics into an actually effective model of reality."?"

Remember, we are talking about modeling nature, that is, we are talking about something within the realm of physics. As a physicist, I really don't care that much which axiom system to use as long as it serves as a foundation that gives me a model of the world with the qualities that I desire: above all, predictions that match real-world observations, but also conceptual clarity and relative simplicity of the calculations, and some elegance wouldn't hurt either. Most physics models are several layers removed from the axiomatic foundation of mathematics. In that sense, I would say my work is atypical. But that is the direct result of trying to incorporate into mathematics a new distinction which, beyond the level of logic and possibly some poorly explored models of set theory, simply didn't exist. I must admit that until about 1.5 years ago, when I first started the effort of learning about these formal systems, I was not all that interested in mathematics for its own sake.

      "You talk of ZFC (and your ZFCD)... Do you think the axiom of choice is reasonable axiom in a system that contains real numbers which are uncomputable?"

Well, I can live with non-constructive proofs, and I can live with the Banach-Tarski paradox. I think my pragmatic physicist side is showing when I admit that I prefer a more powerful formal system over a less powerful one, even if, as a side effect, it proves to be sometimes "too powerful" because it allows you to derive highly non-intuitive results. The cap on this is of course provided by consistency. I don't want a system that is so "powerful" that you can prove literally anything at all. Incidentally, I am certain that once the formal system I am working out is complete, there will be highly counterintuitive implications lurking in the background, waiting to be discovered.

      "How could one actually choose one of these uncomputable reals, if they cannot be specified in a finite way."

Actually, Wolfram's work to which you pointed me might be a possible way to do it. If Wolfram's system could be used to enumerate many different very similar but not identical set-theoretic models based on the enumerated axiom system, perhaps it might be possible to devise an algorithm which chooses the model in which you can approximate the number to the desired level of precision. The analogy that pops into my mind is that of traditional musicians and filmmakers, who use notes and individual frames, respectively, as their basic building blocks, whereas "mash-up artists" use entire blocks of these as their basic building blocks.

      "Maybe this is where your thoughts on "imagination" comes in? "

      Most definitely, I believe devising any sort of algorithm from scratch for a particular purpose requires at least a modicum of imagination.

      "Also, is ZFCD trying to be a meta-mathematical theory that tries to explore the ramifications of an incomplete system being consistently extended, from a general perspective?"

I don't think so. ZFCD is a set-theoretical model, and all the relevant propositions pertaining to incomplete systems are made within it, where I assume that by "incomplete system" you mean objects such as incomplete pairs, etc. However, ZFCD may well have metamathematical (as well as metaphysical) implications, because the introduction of the formal distinction between actuality and potentiality into established mathematics is (at least in my totally biased opinion) a very important and largely unexplored area.

      "Is what you describe as "pro-actually" a form of determinism, where what you define as "actualizably" a more probabilistic view of the future?"

Well, you have the right idea, but I wouldn't put it quite that way. I would say, as a conjecture, that a world is deterministic iff in this world every actualizability at every moment is a pro-actuality. Actualizability is the more general concept which captures the ontological distinction; pro-actuality is a more specific concept which applies it to the context where the actualizability could most easily be confused with actuality. For instance, someone thinking that measuring some property of one of an entangled pair of particles causes the other to "have" the corresponding property at that moment reflects, in my opinion, such confusion.

      "How would you view the question of whether a 10,000 digit number was prime or not from this perspective?"

As you asked it, without any further qualification, I would interpret the question to ask about an actuality, because numbers, without further qualification, are abstractions of objects in the inner domain. If you had asked me "suppose this number was in the outer domain", then my answer would be that you are asking about a pro-actuality, because numbers in the outer domain are abstractions of objects in the outer domain, and no elements of the outer domain are "actual".

      "Is this related to the idea that something may be considered probabilistic until a proper fully-predictive theory is found?"

      No, I meant to simply point out that what distinguishes probability from non-probabilistic measures is this aspect of "coming into existence" that the others lack. As you know, Kolmogorov's axioms are not sufficient to capture the concept of probability. There is no way you can conceptually frame a unit length or a unit mass as a measure of "coming into existence". But that is not the fault of probability theory, because it just works with the sets that it is given by the set theoretical model. That is why I think that the outer domain is an important addition to set theory: It is exactly the home of the probability measure. The claim that "something may be considered probabilistic until a proper fully-predictive theory is found" strikes me as far too metaphysical for mathematics.
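(For reference, Kolmogorov's axioms ask of a probability measure P on a sample space Ω with a σ-algebra of events F only the following:)

$$ P(E) \geq 0 \ \text{for all } E \in \mathcal{F}, \qquad P(\Omega) = 1, \qquad P\Big(\bigcup_i E_i\Big) = \sum_i P(E_i) \ \text{for pairwise disjoint } E_i. $$

(Any normalized measure, for instance a length divided by a total length, satisfies these, which is the sense in which they do not by themselves single out the "coming into existence" aspect mentioned above.)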

"Is it related to the mathematical fact that we don't know if we are up against a true and unprovable statement or if we just haven't done enough searching to find a proof?"

This is a great question. Although the "coming into existence" could refer merely to one's certainty of belief, I am at this point only treating probability in an objectivist manner. And from an objectivist point of view, it seems to me that it is already a matter of fact whether a statement is true but unprovable, or whether it can be proved in finite time. And that means that if the proof exists, it is already determined to be a pro-actuality (ignoring facts of reality such as that the computers could catch fire, or that in a few billion years our planet will be destroyed, etc.), and if it doesn't exist, then it is already a matter of fact that it doesn't.

      "Is something that is undecidable, probabilistic from this point of view?"

I assume you are referring to undecidability within the context of propositions. I suspect there is more than one way of relating this sort of undecidability to my framework, and for this reason I am not quite sure. However, I think that if one really set out to relate undecidability to probability within it, then it would be possible to do so, but that would lead to a very unfamiliar conception of probability. On the other hand, there is much that is not well understood about probability. Who knows if we have really exhausted all the possible meanings it could have?

      "Are you familiar with Gregory Chaitin's omega constant?"

I was not familiar with it, and even after reading the Wikipedia article, I do not have a good intuition for it.

      "I think you might have some interest in some of the questions I posed at the end of my essay. Somehow many of them seem very relevant to your work. Please consider taking a crack at answering one of them."

OK, you asked: "If quantum mechanics is a world where things can be both "yes" and "no" at the same time, should experimental results be analyzed with Zen Koans instead of logical inferences?"

      I would say that it is not the case that "quantum mechanics is a world where things can be both yes and no at the same time" but rather that within the domain of objects described by quantum mechanics, there are simply no things with actual properties describable in terms of "yes" and "no" until they are "measured". Zen Koans, I think, while often bringing home illuminating insights, do not seem very efficient (or even workable) as deductive systems to me (but then, I know very little about them).

I like the question in particular because it demonstrates how the confusion between actuality and actualizability can create a much more pervasive confusion in our worldview. If I had a fair coin in my hand, not yet flipped, and you asked me "is the outcome of the flip heads?" and I suffered from the same confusion with respect to actuality and actualizability, of course I would say "Yes and no". Since we know what is really going on, does that not seem silly?

      Thank you again for your questions.

      Best,

      Armin

      Dear Armin,

      It appears we are leaving substance and now chasing shadows. The simple question I asked has been left unanswered or answered with other questions. If I may repeat:

Given a light source, e.g. a pulsar say 10^3 light-seconds away, sending out pulses once every 60 seconds, such that the moment a pulse is detected, another has already been emitted and is on its way and would also be detected after 60 seconds. So we have regular detections every 60 seconds. Now, if on detecting a pulse the observer moves towards the next incoming pulse, can he reduce the detection time to 59 seconds? Again, if on detection the observer moves away from the already incoming, in-flight photon, can he delay the detection time to 61 seconds? Or does the detection time always remain 60 seconds, no matter what manoeuvre or motion the observer makes? According to the Einstein quote above, the motion of the observer will not have any effect on optical phenomena.

      Regards,

      Akinbo

As an aside, check out the Sagnac experiment if you have not encountered it before and compare it with the M-M experiment. Also check the latest experimental finding concerning light speed in vacuum (as posted by Pentcho today on the Faster than Light blog).

      This is getting curiouser and curiouser.

      "I thought from my essay it would have been obvious that I am pursuing the interpretation according to which there is no particle before the measurement event"

Sounds like the Moon is not there when nobody is looking.

      Regards,

      Akinbo

      Dear Akinbo,

      "This is getting curiouser and curiouser."

Did you read section 6? If you did, you might want to take another look. If there is something that is not clear, I'd be happy to answer any questions.

      "Sounds like, the Moon is not there when nobody is looking."

      Can we agree that the moon may be a tad different from an elementary particle, or, for that matter, a molecule? Specifically, you can associate space-time vectors with any location in or on the moon. So, I would not worry about whether the moon is still there when nobody is looking.

      Best,

      Armin

      Dear Akinbo,

      "It appears we are leaving substance and now chasing shadows."

I don't know what you are referring to. Looking back at the last 7 posts, all I see is that I asked two simple "Yes" or "No" questions, which my "opponent" wriggles and writhes to avoid giving a straight answer to ;)

What is so difficult about answering my question whether you accept the non-commutation relations of QM (or, equivalently, the Heisenberg uncertainty principle)?

      "Given a light source, e.g. a pulsar say 10^3 light seconds away, and sending out pulses once every 60 seconds, such that the moment a pulse is detected, another is already emitted and on on its way and would be detected also after 60 seconds. So we have regular detections every 60 seconds. Now if, on detecting a pulse, the observer moves towards the next incoming pulse, can he reduce the detection time to 59 seconds? Again, if on detection, the observer moves away from the already incoming and in-flight photon, can he delay the detection time to 61 seconds? Or does the detection time always remain 60 seconds no matter whatever manouevre or motion the observer makes. In the Einstein quote above, the motion of the observer will not have any effect on optical phenomena."

      Yes, I promise I will answer this, but could you please humor me, and give me a simple yes or no answer to mine?

      Thanks,

      Armin

      Armin,

      Let me humor you with a NO answer.

References (not the best I can lay my hands on; there is a recent experiment I can't readily locate right now)...

      Violation of Heisenberg's Measurement-Disturbance Relationship by Weak Measurements

      Experimental realization of Popper's Experiment: Violation of the Uncertainty Principle?

      Particle Measurement Sidesteps the Uncertainty Principle

      Scientists Now Uncertain About Heisenberg's Uncertainty Principle

In summary, Heisenberg's uncertainty principle is a useful conjecture, not a law or principle. I have rubbed your back. Now rub mine :)

      Regards,

      Akinbo


      Hi Armin -

      A very interesting and clearly written essay. I recognized your "default specification axiom" from your 2013 FQXi paper, and I agree that a crucial step toward making quantum theory make sense is to learn how to imagine a world in which possibility plays a fundamental role.

      I was struck by your statement that "the state of early 21st century mathematics is such that everything represented by mathematics is represented as an actuality." My knowledge of math is too rudimentary for me to know if that's true, or to judge how well your new logic remedies the situation. But it's clear that our philosophical tradition has always treated the given actuality of things as basic, while possibility has been understood mainly as a kind of defective actuality, as what might actually exist but in fact doesn't.

      I've been thinking about this while reading Ruth Kastner's book on the Transactional Interpretation of QM, subtitled "The Reality of Possibility". Your treatment of the Born rule in terms of time-symmetry reminded me of her theory, which describes a sub-spacetime realm of possible interaction underlying the actually observed events of our world. Though I think her interpretation is excellent as far as it goes, it still conceives possibility only as what's potentially actualizable. I think this misses something essential about the way possibility works in QM, and it may be that this same limitation applies to the approach you have in progress.

The key for me is that possibilities don't "just exist" in the world; it takes very special kinds of situations to make things possible. I wasn't clear on exactly what role "context" plays in your logic, but since you treat it as a "sample space" for measurement, I gather you have in mind the "entire measurement arrangement" that has to be taken into account in writing the wave-function for a system. Kastner's approach, like many others, just takes it for granted that such situations exist, i.e. that it's physically possible to make observations. The focus of my current essay, and also my 2013 FQXi essay, is on what's required in the mathematical structure of physics to do this - to create situations where specific outcomes are physically meaningful, which then contribute to setting up new contexts where new outcomes become possible. The argument is that a number of essentially different kinds of structure are needed for this, as reflected in the remarkably diverse set of physical parameters we need to describe our world.

      I hope your project meets with great success... it's not so common that imagination is combined with such clear thinking.

      Thanks - Conrad