Constantinos,
Thank you for your interest, and for forwarding Hayrani's comments... I'll follow up via email.
Ken
I liked your essay. I think that your boundary-induced quantization is similar to a path-integration condition. Certainly with respect to past and future this seems to be the case. The individual paths will constructively and destructively interfere with each other so as to match the endpoint boundary conditions.
There is a bit of a point which I am pondering. The big bang as a boundary for quantization makes sense if the spacetime is classical or continuous at the start. Otherwise, your BIQ is approximate. If spacetime in the very early universe has quantum fluctuations, or is quantized, then I am less certain about how one can apply that as a boundary. If, on the other hand, spacetime is fundamentally classical, where quantum gravity only refers to some other field from which spacetime emerges, then this theory should be more exact. Even so, I am not sure how one would treat the quantum field from which spacetime emerges.
Cheers LC
Hi Ken,
Good to see you, and congratulations on this lucid essay. You have a deep understanding of the quantum and the ability to explain it so well. And I don't say this just because I agree so much with what you wrote :) [paper, video]. I would just like to comment that the system is quantized with or without the boundary conditions. What these conditions can bring in is the discretization of the spectra, as you explained so well.
Best regards,
Cristi Stoica, Infinite Resolution (this year I focused on singularities in GR)
Thanks for the kind comments... I was waiting to reply until my latest paper (relevant to your first comment) was up on the arXiv, but it now looks like it's going to be another week, so I'll post the link later.
As far as the point that you're pondering... Just because something is "classical" does not mean that it must emerge from some deeper quantum level. One can look for quantum phenomena to emerge from a classical foundation, rather than the other way around -- the central point of my essay, really. Still, I think it's important to take a sufficiently broad view of a "classical foundation" -- say, a local Lagrangian field density on a classical spacetime manifold. From that unproblematic starting point one can study various "nonclassical" rules and constraints, and see how they might lead to higher-level quantum behavior.
Best, Ken
Hi Cristi,
Thanks!
But I'm confused: what do you mean by "the system is quantized"? Clearly not the discrete outcomes... What features of a generic system determine whether it is "quantized", if not any discretization? (Deep question there, I know, but you seem to have something particular in mind...)
Cheers,
Ken
Ken,
Hmmm. I don't know if you have swayed me on this, but I will say that I have a much better understanding of the block universe concept now. I still think it is discrete on some level, but I think I'm thinking of discreteness in a slightly different way. So, kind of like the reverse of the usual way of thinking, imagine that everything is locally continuous (which I don't necessarily think it is, but let's just suppose for the sake of argument it is). How could you tell if your little local part of the universe wasn't just some discrete point in a much larger system? Or, for that matter, what if our universe is a discrete point in some strange system of multiple universes? Wacky stuff, but hopefully it illustrates the way in which I envisage "discrete" here.
Ian
Dear Ken
That was the most clearly written and beautifully insightful essay I've read. It also gave me many new answers and viewpoints, and much confidence in my own model of discrete fields (DFM). I've also learned a lot from your refined explanations. You certainly have a 10 from me, but what I'd like is for you to read my own logic-based but rather agricultural local-reality iteration of.. ..well, really of what you seem to be suggesting may be true (which shows limits to Bell's domain). http://fqxi.org/community/forum/topic/803
I've been struggling with it, as I can't seem to falsify or find the errors in how the DFM seems to fully unify SR/GR and QT. It uses a real quantum symmetry-breaking boundary transition process implementing energy changes in an underlying field structure. It also made my hair stand on end when you referred to the ultimate boundary of the big bang, as I recently posted a short pre-print paper reaching a logical and very physical solution to how exactly that.... anyway, the paper, which only took 2 hrs to write as a part derivative of a full one under consideration, is at: http://vixra.org/abs/1102.0016 I would really appreciate you reading both, and advising me precisely where the errors are, as unfortunately no one has found them yet.
There are a number of other papers looking at the implications, which are quite extraordinary, seeming to resolve issues right across physics. It seems to suggest our failure has been one of complex logical thought, involving visualisation skills with multiple variables, and an over-reliance on mathematical abstraction.
I'll say no more for now, but just thank you, for your essay, and in advance for your time and hopefully comments.
Best wishes
Peter
Hi Ken,
I did not intend to be cryptic, I just wanted to be concise :). Thanks for the feedback; indeed I need to give more detail. If I understand well, you start with an equation describing a quantum system (e.g. Schrödinger, Klein-Gordon or Dirac). Then you exhibit, in the system described by that equation, discrete behavior arising from appropriate boundary conditions. I think you did right. I think that the wavefunction is fundamental, and physical (and I don't think that the original idea of Schrödinger, who interpreted the square of the wavefunction as the charge density of the electron, is that bad, only that it has some trouble when more particles are involved). This is why I agree with your approach. You name this method "Boundary-Induced Quantization". I think that your usage of the word "quantization" is appropriate, because it shows that discrete behavior arises from a continuous field (which can be the wavefunction, an electromagnetic field, etc.). On the other hand, what I intended to point out is that there is a standard usage of the term "quantization". This usage refers to procedures which are applied to a classical theory in order to obtain a quantum one. In the Hamiltonian of a classical system one replaces the classical variables (functions) with operators. This way, we obtain the Schrödinger equation from nonrelativistic systems of classical particles, the Klein-Gordon and Dirac equations from relativistic systems of classical particles, and QFT from classical fields.
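To give just the simplest one-dimensional sketch of the recipe I mean: one replaces $x \to \hat{x}$ and $p \to \hat{p} = -i\hbar\,\partial_x$, so that the classical Hamiltonian $H = p^2/2m + V(x)$ becomes the Schrödinger equation $i\hbar\,\partial_t\psi = \left[-\tfrac{\hbar^2}{2m}\partial_x^2 + V(x)\right]\psi$.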
Schrödinger started with his equation, which, according to the definition above, represented a quantized system, and showed that for electrons bound in an atom the only possible energy eigenstates correspond to discrete modes. In this way he explained the discrete part of the electron's energy spectrum. We can distinguish two steps: 1. Obtain the quantum equation from the classical one, and 2. Show that bound electrons exhibit discrete behavior. According to the terminology I mention, the first step is quantization. According to the meaning of the word, I agree that you can call the second step quantization too. I just wanted to clarify, because one may wonder what the relation is between BIQ and canonical quantization, geometric quantization, the various prescriptions for "second" quantization, etc.
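(A textbook toy model of that second step, simpler than the atom: for a particle confined to a box of width $L$, the boundary conditions $\psi(0) = \psi(L) = 0$ admit only the standing waves $\psi_n(x) \propto \sin(n\pi x/L)$, so the allowed energies are $E_n = n^2\pi^2\hbar^2/2mL^2$ with $n = 1, 2, 3, \dots$; the discreteness comes entirely from the boundary conditions imposed on a continuous equation.)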
Given that I see two steps, and I associate BIQ only with the second, I need to mention that I do not intend by this to say that your view is incomplete. The first step, passing from a classical description to a quantum one, is only due to the historical accident that we understood classical systems before discovering quantum behavior. The fundamental one is the quantum system, and there is no need to show how we go from classical to quantum. It is an artifact of the original impression that the classical is obtained immediately just by h->0 (which turned out to be insufficient). What we need to show is the reverse: how to obtain the classical from the quantum.
Best regards,
Cristi
Ian -- Yes, of course you're right that there's no way to prove there's not a discrete substructure, and even if one was found, it would still be possible that there was a continuous sub-substructure under that! (etc., etc.)
But my point is that if you take QM measurements away, there's no *evidence* that anything is discrete. And if those same discrete measurements can be explained as an emergent feature of a continuous system, as I'm proposing, then there's no evidence for anything fundamentally discrete at all.
Sure, it still may turn out to be that way, but one shouldn't just instinctively point to QM as evidence that reality is discrete, especially given the measurement problem.
Hi Cristi,
Ah -- I now see where you are coming from, but I disagree with the "out" that you've provided me. I do not think physicists should simply "start" with operator-valued equations and explain classical physics as some limit of those equations. Especially given that there is another approach.
It turns out that the Klein-Gordon equation *is* the classical equation for a classical scalar field. There's also a classical Dirac field (see the last chapter in Goldstein on classical fields). There's nothing "quantum" about these equations until you start interpreting them via operators as you describe. Yes, if you start with particles you have a problem, but recall my premise is that everything is continuous, so one is forced to start with classical fields anyway.
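To be concrete: a real scalar field with the purely classical Lagrangian density $\mathcal{L} = \tfrac{1}{2}\partial_\mu\phi\,\partial^\mu\phi - \tfrac{1}{2}m^2\phi^2$ (in units with $\hbar = c = 1$) has the Euler-Lagrange equation $(\Box + m^2)\phi = 0$, which is just the Klein-Gordon equation, with no operators in sight.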
So why do we then go to operators? It's the easiest way to get to a framework that can predict discrete outcomes. But if there is some other way to get a near-discreteness without operators -- as argued in my essay -- then there would never be any reason to do your "step #1" in the first place.
Now, after several years spent hoping that one could get all of quantum theory by applying closed-hypersurface boundary conditions to classical field equations, I've finally come to terms with the fact that this alone isn't going to work. But I'm far from giving up on the classical field framework itself. (I've just dropped back from field equations to the classical Lagrangian densities that generate those equations in some -- perhaps approximate -- limit.) After all, if you "start" from an equation that one can't even interpret, none of the consequences are going to be interpretable, either.
Best,
Ken
Hi Peter,
Thanks for the kind words -- but I'm afraid I don't really see any connections between our two essays. Still, I'm glad my essay gave you some useful ideas.
My only comment on your essay would be that I think you would find it beneficial to treat light as a wave, especially when it comes to analyzing light in a moving dielectric or plasma. The distinction between phase velocity and group velocity is particularly crucial to your analysis (in a plasma, for example, the group/signal velocity is c*n, while the phase velocity c/n actually exceeds c, since n < 1).
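For reference, the standard cold-plasma dispersion relation is $\omega^2 = \omega_p^2 + c^2k^2$, which gives an index $n = \sqrt{1 - \omega_p^2/\omega^2} < 1$; the phase velocity is then $v_p = \omega/k = c/n > c$, while the group (signal) velocity $v_g = d\omega/dk = c\,n$ stays below c.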
Ken
Dear Ken,
I understand. My view is also that we need to start from the classical field equations, and not from the operator-valued ones, and that we need to add some constraints on this field to "manifest the quanta". While you rely on boundary constraints, I postulated a "principle of integral interaction", and I speculated that such a principle will arise somehow from the topology of the spacetime and the gauge bundle [here].
Best regards,
Cristi Stoica
Ken
Many thanks. I agree with the wave treatment. You'll have noticed I consistently referred to signal, not phase, velocity to avoid confusion. I've studied and researched optics for many years and there is still poor understanding of it, within but particularly outside optics. Optical fibre and plasmon science have helped, but, well, just look at Nano Letters DOI 10.1021/nl103408h and Science, vol. 331, p. 892, to see how poor the science of just a few years ago was.
I wrote a paper clarifying much re: superposition, harmonics, plasma and refraction, but to the specialist editors it's not a 'new discovery', just a clearer way of explaining what we've already discovered, and to general journals it's too far from the ruling paradigms to be considered! We have to smile!! I've now been asked to agree to publication in a less mainstream journal. What does one do!?
I could have written a whole essay on the wave aspects, but omitted it all to avoid red herrings as it is the overview that's important.
I'm not sure if you saw the fundamental derivation from correctly treating time-averaged Poynting vectors in co-moving ion media, or missed it. It did require slow reading, difficult multi-variable visualisation, and consideration of the consequences. Essentially it derives SR and GR from pure logic, with a preferred third frame and a quantum mechanism, and it's falsifiable.
Or perhaps you disagreed with the logic for some reason? Please do advise if you can find the time. (Don't get confused by plasma waves as we're dealing only with the block reference frame of the medium).
Best wishes
Hi Ken,
I think my response to your email is more appropriately posted here, since others may benefit by our discussion.
You write,
" I'm quite interested in new ideas of how to get quantum behavior to emerge from classical fields".
This was also what attracted my attention to your essay, as this is exactly what I am doing in my essay. What I mathematically demonstrate is that Planck's Law for blackbody radiation can be derived using continuous processes, without using energy quanta and statistics. Since Planck's Law is at the very roots (historical as well as theoretical) of modern physics, this result is very significant.
But more than that! In my essay I show that Planck's Law is an exact mathematical tautology that describes the interaction of measurement! This, in my view, explains why Planck's Law fits the experimental data so remarkably well. Check the blackbody spectrum obtained from measurements against the one obtained from Planck's Law: "The FIRAS data match the curve so exactly, with error uncertainties less than the width of the blackbody curve, that it is impossible to distinguish the data from the theoretical curve". Naturally, the measurements will be exactly the same as the tautology that describes the measurements.
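For reference, the law in question is the spectral radiance $B_\nu(T) = \frac{2h\nu^3}{c^2}\,\frac{1}{e^{h\nu/kT} - 1}$, and it is this curve that the FIRAS measurements of the cosmic microwave background trace out so precisely.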
I also show in my essay why it is mathematically true that energy is proportional to frequency and why the uncertainty principle must hold. But this is only just some of the results in my essay. Too many to list here in this post!
You further write,
"Really, I was stumped at "mathematical identity" -- at that point you are claiming to derive a physical conclusion with no physical assumptions...? Surely there is some link to physical reality in this math, or it wouldn't mean anything. So what's the underlying picture of reality that this math is assuming to be true? "
I fully understand why you were "stumped" at the mathematical-identity nature of Planck's Law. I was anticipating just such a response!
But there is nothing unusual about finding mathematical tautologies in physics, and without these having a 'physical basis'! If I were to measure a distance of 3 miles going east, follow that with a distance of 4 miles going north, and then measure that I am a distance of 5 miles from where I started, do I need to have derived the Pythagorean Theorem using some 'physical basis' in order for this theorem to apply to my physical measurements? Likewise with Planck's Law, as I show in my essay!
Concerning my photoelectric effect paper: I am surprised that you actually read it, since I don't discuss this result in my essay!
You write,
"1) There is no experimental delay between the time that a weak photon source is turned on and the time that the detectors start registering the photons. If the energy had to "build up" over time, one would expect to see such a delay."
As I explain in the paper, the time required for an 'accumulation of energy' h to occur (the minimum threshold needed for energy to manifest) is h/kT. I think you will agree that this is a very short time! I don't think any experimental claims are for a shorter time.
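To put a number on it, taking room temperature $T = 300\,\mathrm{K}$ purely as an illustration: $h/kT \approx 6.63\times10^{-34}\,\mathrm{J\,s} \,/\, (1.38\times10^{-23}\,\mathrm{J/K} \times 300\,\mathrm{K}) \approx 1.6\times10^{-13}\,\mathrm{s}$, a fraction of a picosecond.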
But there is a more general principle about 'instantaneous' that you raise which I find very important. Do you really believe that if the 'source' is turned on at, say, t=s, the 'sensor' will detect the photon at t=s also? That 'instantaneously' (in the sense t=s) the photon will be detected? I show in my essay that the Second Law of Thermodynamics states that some positive duration of time is required for a physical event to manifest. Physical events have both 'extension' in space and 'duration' in time. Your view that events happen 'instantaneously' at t=s in my opinion violates this fundamental Law.
You further say,
"2) A related issue is when the average field is very weak everywhere, but there are many detectors. If the energy has to build up to hv on one particular detector, they would all take a long time to fire -- but in fact one of them will fire quite quickly, as if all the energy in the whole field somehow was "directed"."
This would be a paradox if you assumed 'ballistic photons' carrying an energy of hv (how? don't ask!) following a path trajectory and striking some one detector! But in my view, the 'photon emitted at the source' is not the same as the 'photon detected at the sensor'. These are separate but related events, as I also argue in my explanation of the double-slit experiment. A detector will 'fire' when it has 'a minimal accumulation of energy that can be manifested'. If a detector does not 'fire', it means that it does not have that threshold to 'trip' the detector. You may ask, what happens to the 'lower than threshold' energy at a detector? It's possible that eventually it just dissipates into the Cosmos, undetected and undetectable. Or it may linger around a bit for the next photon to 'strike'. Since all this is below our 'veil of observation', we just won't know.
Finally you say,
" I simply don't understand how you can simply assume that the energy is always exponentially increasing with time"
What is exponentially increasing with time is the 'time dependent local representation' E(t). But this is at the level of 'accumulation before manifestation'. When energy becomes 'manifested', an amount of energy hv (in agreement with the quantization hypothesis!) is absorbed and the 'exponential representation collapses' (see my essay for a fuller description of this).
Ken in my essay I present exactly what you are also seeking: Quantum Theory without Quantization. This is what brought me to your corner!
Best wishes,
Constantinos
Ken
I was interested in your comments to Constantinos. You seem to be saying he may be wrong about the delay. In fibre optics the delay is established with great accuracy as polarisation mode dispersion (PMD) delay, somewhat frequency- and polarisation-dependent (birefringence), but fully consistent with what Constantinos derives. This is the 'charging' or 'momentum' delay of scattering.
Also consistent with this, and of topical interest, are the latest results reported in Science vol. 331, p. 892, and p. 16 of the 26th Feb NS, where particles were charged and 'bounced off', or were re-emitted by, the fine structure ABOVE the surface of matter (done here with coated glass). This is equivalent to reflective scattering. The 19th Feb NS (p. 18) shows plasmons 'grabbing photons' through a nano hole and re-emitting them, when an 'empty' hole won't let them through at all! All equivalent to QED, with electrons 're-emitting' photons, and always at the relative 'c' of the electrons if in a refractive medium co-moving wrt an incident medium. And we find the greater the relative motion, the higher the 'fine structure' surface plasma or 'plasmasphere'. Is that purely a coincidence? The discrete field model (DFM) explores the implications if not. It's consistent with Constantinos's and Edwin's views, and we haven't been able to falsify it yet.
You ask about "how to get quantum behavior to emerge from classical fields". It is worth considering the converse: how to get classical relativity to emerge from quantum behaviour. With refractive dispersion this seems to emerge naturally.
Food for thought?
Best wishes
Peter
Peter,
Thanks for all the experimental facts you brought to my defense! I had no idea there was so much evidence for such a 'time delay'. I think Eckard Blumschein would also add to this list the Gompf et al. false measurements of single-photon counting. Rethinking this issue over again, I would like to add to these supportive arguments and experimental evidence the Heisenberg Uncertainty Principle. Clearly, QM uncertainty results in some positive duration of time for an amount of energy 'delta E' to manifest.
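In rough terms, the energy-time uncertainty relation $\Delta E\,\Delta t \gtrsim \hbar/2$ already suggests that an energy of order $\Delta E$ needs a time of order $\hbar/\Delta E$ to be resolved.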
I think the rejection of my proof that Planck's Law is a mathematical tautology describing the interaction of measurement is more 'disbelief' than 'refutation'. It cuts so deeply into the grain and fibre of modern physical thinking. It's just hard for physicists to accept.
Best regards,
Constantinos
Dear Ken,
Really interesting, accessible, clear, enjoyable. Nice introduction explaining your approach to the question and where you are going with it. I definitely want to spend more time re-reading it, as it is full of good ideas and explanations.
PS. I have used a quote from your essay on the FQXi Time Travel blog forum (where you very clearly explain the static nature of spacetime).
Good luck, Georgina.
Ken / Costas
There's much more on delay time too. Also look at the Mössbauer effect (1957), where the charge/emission scattering delay is attributed to 'recoil'. There is a logical discrepancy here related to continuous processes, which is probably why his results are often ignored, but the actual results have been repeated and confirmed (at one of the major US universities, I think).
The frequency dependence of PMD in fibre optics is fascinating, as it also reverses at a certain frequency! My work focussed on harmonics, which explains this and absorption bands in terms of the Huygens/Fresnel principle (HFP), in terms similar to superconductivity. Waves are still very poorly understood!
Peter
Dear Ken
I agree with your argument that discreteness is just a consequence of our models, but in the same way continuity is also just a consequence of our models. Until now we have ignored that the properties of nature we see are conditioned by our models, particularly by the logic we use to study nature; this is not a philosophical idea but a mathematical reality. In my essay I try to explain how our perception of quantum reality is blurred by the use of classical-logic tools. I would like to hear your opinions about it.
Regards,
J. Benavides
Dear Ken,
Thanks for a fascinating essay. I agree with the premise of taking the unpopular route of making QT more compatible with GR. Focusing on the measurement problem from the GR POV is both novel and creative. I would also like to point out the work of Joy Christian, to which I was introduced on the forum of FQXi's very own website, and which seems to support your work, although he concentrates on non-locality. He uses topological and division-algebra arguments to conclude *that "quantum non-locality" is nothing but a make-belief of the topologically naive.*
Having said that I have one small "quibble" of my own. You wrote: "First and foremost, GR is a theory of spacetime."
It was my understanding that GR is first and foremost a theory of gravity that includes spacetime. Isn't it true that GR is actually agnostic as to the ontology of spacetime? Although gravity is assumed to be the curvature of spacetime, isn't it indistinguishable from a field in an arbitrary background? For example, see here. In the words of Kip Thorne, isn't the "curved spacetime paradigm" equivalent to the "flat spacetime paradigm" in GR?
I would be interested in your response, and thanks again for a beautiful essay.
Dan