"Your remarks about Ken's approach surprise me. Perhaps I should re-read your exchange with him."
Well, I am critical not because I believe that his models are a bad idea, but because I believe that the conceptual framework needs to be developed more carefully. After all, there are ways of writing down classical theories that make them look like they involve weird causality, such as the Wheeler-Feynman absorber theory, but we know in that case that there are alternative ways of writing things down that have conventional causality. Therefore, it is not enough to say "look, I have derived a model starting from a block picture of spacetime". One also has to prove that there are obstacles to understanding the theory in any other way. To achieve this, we need an analysis of the possibilities for such models at least as rigorous as Bell's analysis of theories with conventional causality. For this reason, I am putting his work in the "suggestive argument" category for now and, as I said, there is nothing wrong with being in that category.
"I'm not sure whether you answered the question as to whether non-locality would be obvious had Bell never invented his inequality. If you'd care to clarify it, I'm still interested."
Sorry, I didn't realize that this was one of your questions. My answer is a definitive no. Without Bell's analysis, I think the EPR reasoning would stand (or, better, Einstein's earlier arguments, which are less confusingly tied up with the uncertainty principle) and the best response would have been to look for a local hidden variable theory. In fact, I think there are only a few results that point to fundamental difficulties in interpreting quantum theory. These are:
- Bell's theorem
- Contextuality (this starts with Kochen-Specker, but I prefer Spekkens' more general definition)
- Results about the reality of the wavefunction (the PBR theorem et al.)
- Excess baggage theorems (results about how the size of the ontic state space must scale exponentially with the number of systems)
Each one of these theorems points to an explanatory gap. Namely,
- Ontological models must be nonlocal, but they must also be nonsignalling. (A numerical sketch of this tension follows the list.)
- Ontological models must be contextual, but the operational probabilities must be noncontextual.
- Ontological models must have the wavefunction as part of their ontology, but many quantum phenomena are most naturally interpreted in terms of an epistemic wavefunction.
- n qubits must carry O(2^n) bits of information, but however you define the operational information content of n qubits, it comes out as O(n) bits. (A worked version of this bound also follows the list.)
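To make the first of these gaps concrete, here is a minimal numerical sketch of the CHSH form of Bell's theorem. It is just my own illustration in Python with numpy, using the standard singlet state and the standard measurement angles, nothing specific to the discussion above: every local hidden variable model is bounded by 2, the quantum value is 2*sqrt(2), and yet the marginals carry no signal.

```python
import numpy as np
from itertools import product

# Pauli matrices
I2 = np.eye(2)
X = np.array([[0.0, 1.0], [1.0, 0.0]])
Z = np.array([[1.0, 0.0], [0.0, -1.0]])

def obs(theta):
    """Spin observable along angle theta in the X-Z plane (eigenvalues +/-1)."""
    return np.cos(theta) * Z + np.sin(theta) * X

# Singlet state (|01> - |10>)/sqrt(2)
psi = np.array([0.0, 1.0, -1.0, 0.0]) / np.sqrt(2)

def corr(a, b):
    """Correlation <psi| A(a) (x) B(b) |psi>."""
    return psi @ np.kron(obs(a), obs(b)) @ psi

# Any local deterministic assignment of +/-1 outcomes obeys |S| <= 2,
# and hence so does any probabilistic mixture of them (an LHV model).
lhv_bound = max(abs(A0*B0 + A0*B1 + A1*B0 - A1*B1)
                for A0, A1, B0, B1 in product([-1, 1], repeat=4))

# The singlet with the standard CHSH angles reaches |S| = 2*sqrt(2).
a0, a1 = 0.0, np.pi / 2
b0, b1 = np.pi / 4, -np.pi / 4
S = corr(a0, b0) + corr(a0, b1) + corr(a1, b0) - corr(a1, b1)

# Yet the statistics are nonsignalling: Alice's marginal <A(a)> does not
# depend on Bob's setting (for the singlet it is 0 at every angle).
alice_marginal = psi @ np.kron(obs(a0), I2) @ psi

print(f"LHV bound:          {lhv_bound}")        # 2
print(f"Quantum CHSH value: {abs(S):.3f}")       # 2.828 ~ 2*sqrt(2)
print(f"Alice's marginal:   {alice_marginal:.3f}")  # 0.000
```

The gap is visible in the output: the correlations cannot come from any local hidden variable model, yet nothing in the marginal statistics can be used to send a signal.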
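To unpack the last gap a little: a pure state of n qubits is a ray in a 2^n-dimensional Hilbert space, so writing it down takes exponentially many real parameters, while Holevo's theorem says that no measurement can extract more than n classical bits from any encoding into n qubits. In symbols (a standard statement of the two counts, not a derivation):

```latex
\underbrace{2 \cdot 2^n - 2}_{\text{real parameters for } |\psi\rangle \in \mathbb{C}^{2^n}} = O(2^n),
\qquad
I(X{:}Y) \;\le\; \chi \;=\; S\Big(\sum_x p_x \rho_x\Big) - \sum_x p_x S(\rho_x) \;\le\; \log_2 2^n = n .
```

The excess baggage theorems sharpen this into a statement about the ontic state space of any realist model, not just about the standard Hilbert space description.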
All other phenomena that I know of can be modelled quite straightforwardly so long as one does not stick to the dogma that reality must be described by particles travelling along definite trajectories. This is why I am not impressed by arguments based on basic interferometry experiments like the double slit.
What the explanatory gaps indicate to me is that there is something wrong with our basic framework for realist models of quantum theory. The right framework, whether that involves retrocausality or some other exotic thing, should close all of these gaps, e.g., it should reveal that quantum theory is not nonlocal after all, and similarly for the other gaps.