Angelo, Saikat, & Tejinder,

I am somewhat familiar with Bohm QM, less so with dynamic collapse, and I will have to read up on trace dynamics. I did read something a while back about how dynamic collapse runs into some difficulties.

Bohm QM works well enough for systems with quantum observables that have a direct correspondence with classical mechanics; it does not work very well without such a classical-quantum correspondence. Contrary to what is commonly said, BQM does work in a relativistic setting: you can write the Klein-Gordon equation in real and imaginary parts, just as with the Schrödinger equation. What becomes troublesome is when you try to do interacting QFT. There is no natural ladder of states from which to describe the production of massive particles, and as a result there is no workable BQM form of QED.
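
To be concrete about the free relativistic case (my notation, a standard Madelung-type splitting, not anything taken from the essay): writing the Klein-Gordon field in polar form, phi = R e^{iS/ħ}, with metric signature (+,-,-,-), gives a relativistic Hamilton-Jacobi equation with a quantum potential term plus a continuity equation,

\[
\partial_\mu S\,\partial^\mu S = m^2 c^2 + \hbar^2\,\frac{\Box R}{R},
\qquad
\partial_\mu\!\left(R^2\,\partial^\mu S\right) = 0 .
\]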

These approaches to QFT, which are really forms of quantum interpretation, are minority reports. I did some work on quantum chaos where I used BQM. I used it because it is close to a classical description and is convenient for working with Hamiltonian chaos. The response was not good, largely because of the Bohm part. I kept trying to argue that there is nothing erroneous with writing the wave function in polar form, separating the Schrödinger equation into real and imaginary parts, and so forth. BQM is admittedly weak in some respects, but it is not wrong. I don't think BQM can replace standard QM, nor do I think it is likely the others will either. Yet they have their niche.

Cheers LC

    Dear Lawrence,

    Thank you for your interest in our work and for reading our essay.

    We do not quite understand what you mean by your remark:

    "Bohm QM works well enough for systems with quantum observables that have a direct correspondence with classical mechanics. Bohm's QM does not work very well without a classical-quantum correspondence."

    What we have said in our essay is that

    "In this theory, a system of N non-relativistic particles is described by a wave function which lives on configuration space, and by the actual positions of the particles. The positions evolve according to a 'guiding equation' which depends on the wave function, while the wave function itself obeys the Schro ̈dinger equation."

    This, along with the quantum equilibrium hypothesis, completely defines the theory. Of course the Hamiltonian must be prescribed by hand, and yes, there one relies on experience from classical physics, but that is as true for standard QM and dynamic collapse as for BM. With the above structure, BM reproduces all the results of standard QM. And it does better, because the collapse of the wave-function / the projection postulate is not put in by hand in an ad hoc manner as a recipe to explain the outcomes of experiments.
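
    To be explicit (our shorthand here; this is the standard textbook form, not a new result), the two equations are the Schrödinger equation for the wave function on configuration space and the guiding equation for the actual positions Q_k:

    \[
    i\hbar\,\frac{\partial \psi}{\partial t} = \Big(-\sum_{k=1}^{N} \frac{\hbar^2}{2m_k}\nabla_k^2 + V\Big)\psi ,
    \qquad
    \frac{d\mathbf{Q}_k}{dt} = \frac{\hbar}{m_k}\,\mathrm{Im}\!\left[\frac{\psi^{*}\nabla_k \psi}{\psi^{*}\psi}\right]_{(\mathbf{Q}_1,\dots,\mathbf{Q}_N)} ,
    \]

    with the quantum equilibrium hypothesis stating that the initial configuration is distributed according to |\psi|^2.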

    We agree that one can easily write down the one particle / non-interacting relativistic BQM. But that is trivial isn't it? We always had the interacting relativistic QFT in mind.

    If by standard QM you mean the one with Copenhagen interpretation, we believe it is an incomplete theory. It does not explain what happens during a measurement - it merely postulates collapse of the wave-function without prescribing a mechanism for the collapse. We believe the least one would have to do to complete the theory is to adopt the many-worlds interpretation or BQM. Personally, we have difficulties with many worlds since we do not see how probabilities and the Born rule can come about in MW. So we would prefer BQM over MW.

    Alternatively, collapse of the wave-function might be explained dynamically - we think this issue will be settled by experiments, because dynamic collapse theories make predictions different from those of BQM / standard QM in the as yet untested mesoscopic domain.

    With our best regards,

    Authors

    Angelo, Saikat, & Tejinder,

    The information embodied in particle properties (which can be thought of as internalized rules of behavior, the expression of laws of physics) in a self-creating universe must be the product of a trial-and-error evolution. If fundamental particles have to create themselves and each other, and particles only exist to each other if and for as long as they interact, then particles, particle properties, 'its' must be as much the source as the product of their interactions, both cause and effect of a continuous energy / information exchange. Information then can only evolve, become information, when molded into material particles and tested in actual particle interactions: only such information survives as enables its embodiments to survive, to manifest themselves as real particles. So I don't see how you can have one without (before?) the other, how one can be more fundamental than the other.

    Regards, Anton

      Bohm QM is basically the Schrödinger equation split into real and imaginary parts. The real part is a Hamilton-Jacobi equation that includes the quantum potential term

      -∂S/∂t = (∇S)^2/2m + V - (ħ^2/2m)∇^2R/R

      for the wave function ψ = Re^{iS/ħ}. That quantum potential is associated with the guidance equation. The imaginary part is formally a continuity equation. None of this is wrong exactly, but I think it is weak. One problem is that it depends upon classical variables, whereas we know there are quantum observables that have no classical analogue. In addition, quantum mechanics, with its complementarity of observables, permits one to work exclusively in either the position or the momentum representation. This is a halving of the number of degrees of freedom a theory needs. Bohm QM brings back the full phase space with {p, q} variables.
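
      For reference (my shorthand, with the convention ψ = Re^{iS/ħ}), the imaginary part and the associated guidance relation read

      \[
      \frac{\partial R^2}{\partial t} + \nabla\!\cdot\!\left(R^2\,\frac{\nabla S}{m}\right) = 0 ,
      \qquad
      \dot{\mathbf q} = \frac{\nabla S}{m}\bigg|_{\mathbf q(t)} .
      \]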

      I think this is potentially useful for quantum chaos, since the classical-like structure of the theory is better adapted to the techniques of classical perturbation theory and to looking at KAM-theorem results on the puncturing of invariant tori. The classical-like particle, called the beable, then traces out chaotic motion. Of course, to my way of thinking this beable is really just a mathematical fiction of sorts: a gadget used to compute scarring in quantum chaos.

      The BQM is a sort of interpretation. It is meant to get around the cut-off problem with the Copenhagen interpretation. MWI has been worked out with the Born rule. I think the big open question is contextuality. An observer is free to orient their Stern-Gerlach apparatus by choice. This selects the eigenbasis of a measurement, but the Kochen-Specker theorem tells us that QM has no such contextuality; QM treats all bases equivalently up to a unitary transformation. So this eigen-splitting of the world in MWI, where different observers record different results, has some sort of implicit contextuality.

      It is my general observation that QM interpretations meant to give some dynamics to a measurement, as Bohm's QM does by tying things closely to classical physics, run into their own set of difficulties. Quantum interpretations, in my opinion, are devices that can be employed for different problems, where some interpretation turns out to be more applicable. Now there is the rise of QBism, which ties QM to Bayes' theorem, but as near as I can tell this ends up being just another interpretation.

      Cheers LC

      Dear Angelo, Saikat and Tejinder,

      I enjoyed reading the overview of the three alternatives to orthodox quantum mechanics. I did not previously know much about trace dynamics, so it was good to find out a bit more about it. It would have been nice if the experimental tests hinted at in the article had been described a little, possibly including an expected date (if known) for when they are to be performed. Also, one might ask whether it is possible to motivate the theories a little more. Obviously nature does not need to heed our prejudices, but it would be nice if she could be understood at the most fundamental level in a way that makes sense.

      I wish you all the best,

      Armin

        Dear Anton,

        We are not sure we understand your remarks. Perhaps you refer to a very broad context as to how information relates to matter, interactions and space-time. We have addressed only one aspect of that very broad issue: how the interpretation / understanding of probabilities in quantum theory can dictate the primacy of 'it' over 'bit', or vice versa.

        Best regards,

        Authors

        Dear Armin,

        Thank you for reading our essay and for your comments.

        The experiments have been going on since the nineties, and have put useful bounds on the theories. If you like you could see our recent review article in

        Reviews of Modern Physics 85 (2013) 471

        also available at

        http://arXiv.org/abs/arXiv:1204.4325

        where we discuss these experiments in detail, as well as the motivations for the theories.

        Best regards,

        Authors

        Dear Dr. Singh,

        Your argument for an underlying deterministic basis for quantum theory is novel and very well presented!

        Using a Bohmian approach necessitates that there is an underlying quantum wholeness in which absolute probabilities are defined for every possible outcome. One way to approach this is by using quantum information theory.

        The emergence of classical spacetime from coarse graining classical matrix dynamics thus depends upon the conditional entropy of the observer. The measurement process arises out of her ignorance that this is just another aspect of dynamic evolution. Essentially, she erases the entanglement information of the underlying quantum wholeness. (See my essay "A Complex Conjugate It and Bit".)

        In this way, paralleling your theory, the "it" arises from the "bit".

        Best wishes,

        Richard

          Hi Angelo, Saikat, & Tejinder,

          Yes, my critique (and essay) is about what makes information into information, as an answer to this question may give a clue as to whether nature at the quantum level is random or not. I wonder if the following reasoning might make sense, and I would very much appreciate your answer.

          In classical mechanics (and in general relativity and big bang cosmology) particles are only the cause of forces, so here one has to assume the existence of virtual photons and gravitons to transmit forces between real particles. Though the emission and absorption of virtual photons and gravitons to communicate forces between real particles is supposed to be random, so their energy fluctuates randomly, they nevertheless obey the Uncertainty Principle [UP], according to which a deviation in the energy of a particle may last a shorter time the greater the deviation is. This of course begs the question of how a particle's neighbors can know when to supply it with energy in a timely fashion so it can obey the UP. If, in this classical view, the communication between particles is random, then particles only exist to each other, physically, at the random times they absorb a virtual particle from each other, so they are only intermittently part of each other's interaction horizon, each other's universe. In contrast, if in a Self-Creating Universe [SCU] particles have to create themselves and each other, if they only exist to each other if and for as long as they interact, then to keep existing they must keep interacting continuously. If in an SCU particles and particle properties ultimately must be as much the source (cause) as the product (effect) of their interactions, of the forces between them, then real particles can be thought of as virtual particles which, by alternately borrowing and lending each other the energy to exist, force each other to reappear again and again after every disappearance, so they create and un-create each other over and over without violating any conservation law. As in this scenario the energy sign of a particle alternates, it is a wave phenomenon: the higher the frequency at which its energy sign alternates (its sign flipping every time an increase turns into a decrease and vice versa), the higher its energy is. Instead of saying that its energy fluctuates randomly, in an SCU a particle exchanges all its energy in every cycle, so the UP is just another formulation of the Planck relation E = h v, with v the frequency at which the particle oscillates, exchanges energy.

          It is through this continuous energy exchange between particles that they express and preserve each other's properties: preserving the status quo, this continuous exchange of energy, a.k.a. information, is too inconspicuous for us to be aware of, to assume its existence, let alone identify it as the long-sought 'hidden variables'. According to the UP, the shorter the distance between particles, the higher the frequency at which they exchange energy, and the higher their rest energy is. So if the energy of a particle is the superposition of all the frequencies at which it exchanges energy with all particles within its interaction horizon, a frequency which depends on their mass, distance and motion, then a particle in its 'own' properties contains all relevant information about its entire universe, information which is refreshed in every cycle of its oscillation. Since E = h v states that energy is a quantity which is greater as its rate of change is greater, and this rate, the energy of a particle, varies within every cycle of its oscillation, then so does the (in)definiteness in both its position and momentum, that is, when we define the mass of a particle to be greater as its position is less indefinite (if the link doesn't work, see: www.quantumgravity.nl, the chapter 'A definition of mass').

          In this fully quantum mechanical view we therefore cannot predict the outcome of particle interactions, because we cannot know what phase the particles are in when they collide or interact, so the probability of quantum theory does not originate in randomness. Randomness would only appear if particle properties were constant, intrinsic, i.e., privately owned, interaction-independent quantities: if they were only the cause of forces and interactions but not also their product. The present confusion comes from trying to understand quantum mechanics (things like the double-slit experiment) while clinging to outdated, classical notions, in particular the idea of causality: the idea that mass can causally precede gravity, which of course is nonsense.

          Regards, Anton

            Dear Tejinder,

            Thank you very much for replying to my post, even if it is on my thread. Though I agree that ideas must be quantified in equations so they can be put to the test, before quantifying things and risking wasting time on flawed ideas I first have to make sure that they don't lead to contradictions, that they are philosophically, rationally sound and might possibly agree with observations, or, if not, whether the observations can be interpreted differently so that they do.

            If a new, good theory expands our understanding like being able to see the world for the first time in color instead of in black, gray and white, then present physics is still charting the world as it has shown itself in color through quantum and relativity theory. The fact that eighty years of effort haven't solved the present contradictions nor led to an understanding of why quantum mechanics works strongly indicates that answers cannot be found within the current paradigm and formalisms. Solving some of those problems may require a new, different way of looking at things, of thinking about them, a view which may expose some key assumptions of the current paradigm as invalid, as I argue in my essay and elaborate on in my post of 19 July.

            If and when (as argued in that post) particles and particle properties indeed are as much the cause as the effect of their interactions, of the forces between them, so that a force cannot be either attractive or repulsive always, of its own, so to say, then this opens up a new, not previously explored path to the unification of forces. As in classical mechanics particle properties are thought to be only the source, the cause of forces, there we need two opposite, independent forces to explain any equilibrium between particles. Such an equilibrium not only would be very unstable (unless we can invent a mechanism to avoid this, like asymptotic freedom); since opposite forces must be powered by different, independent sources, i.e., by physically unrelated particle properties, they can never be unified, even in principle. As far as I am aware, string theory starts from just that classical assumption (never mind the Higgs mechanism), so it will never succeed in what it is intended to do. String theory to me therefore is a prime example of what happens when we allow mathematical formalisms to head our investigations for lack of ideas. Based on a misunderstanding about the nature of mass, of gravity, string theory is only one of the currently popular theories which, I think, cannot solve anything but instead are part of the problem.

            With regards, Anton

            Dear Dr. Tejinder Singh,

            I like your multifaceted approach and questioning of the concept of probability. One of my goals is to topple the uncertainty principle as a way to make physics "rational". I think I have made a good start with my essay. It would be wonderful to get your comments on my work, so please visit my blog.

            I am on my way to give your work a good mark.

            Thanks for the thoughtful essay,

            Don Limuti

              Tejinder et al,

              Nice! We are in full agreement that "it" is primary, i.e., IT may have an existence independent of BIT. Whether it *chooses* to do so, however, is the crux of the important question you ask:

              "When is an apparatus classical? Strictly speaking, we do not quite know." And we never have known at what point quantum phenomena are supposed to "smooth out" to become classical; quantum theory is simply incoherent if not infinitely extended. How, then, can it be both coherent and probabilistic?

              Reading the comments, we disagree that Bohmian mechanics is to be preferred over many worlds. A bifurcating multiverse is more satisfying to me because it preserves the topological simple connectedness that I conjecture is necessary to information conservation and classical time reversibility.

              No matter -- superb job, as always!

              Tom

                Dear Angelo,

                You write

                "Here the it comes first [it being the particle / wave-function / matrix] and is well-defined even before the measurement is made."

                This is where, according to quantum mechanics and the theory of measurements, I think, you are wrong. At least, now, I understand why your view contradicts the "it from bit" philosophy.

                Best regards,

                Michel

                Dear Don,

                Greetings and thanks for your kind comments. I very much hope to see your essay in the next few days.

                Best,

                Tejinder

                Thanks a lot Tom,

                One thing that always puzzles me about many worlds is the Born probability rule. If the wave function eternally exists in a linearly superposed state without collapsing then why do observers associate probabilities with measurements? I really do not know the answer to this.

                Best regards,

                Tejinder

                Dear Tejinder

                Thank you for an excellent, interesting, informative and eminently sensible essay, well organized and written in a clear easily read style.

                I agree that only considering outcomes is wrong, although partly for different reasons. If a measurement is an *output* then surely it must also provide information on the input, which we may obtain only if we understand the complete process. Is the lack of that understanding itself not then the real problem?

                Of course the detection and measurement process (observation) must have some effect on the output. I suggest then that just 'bombing Copenhagen off the map' may also destroy some innocent truths about an essential component in a coherent process. While the 3 alternatives you brilliantly describe each have something to offer, none can offer a solution. This is not so much a criticism of your essay, which I have marked down for a well-deserved top score, but of all current theory; quantum, relativistic, and the 'chasm' (Penrose) between them.

                So I agree with your finding 'bit from it', but suggest Wheeler's proposal is naive, perhaps intentionally so, because our comprehension of the physical process is entirely inadequate. Why then do we not test mechanisms? I hope you may read my essay, as I use the mechanistic approach and logically define detection, computation and measurement as distinct elements of observation. CSL is a key process, and ALL matter qualifies in the role of 'detector', but not all is a 'measurer'. I construct an ontology describing probability, decoding a Bayesian distribution of noise between binary 0 and 1. A powerful new model seems to emerge to overcome the "severe difficulties" and fill the chasm, which I very much hope you can study and comment on for me. Be warned - the odd radical finding emerges!

                But thank you and very well done for your own excellent and important contribution to the process of improving understanding that is 'science'.

                Best wishes and very best of luck in the results. I hope to see you back up in the top 10 as last year.

                Peter

                  Distinguished Professors,

                  "Here the `it' is primary and the `bit' is derived from the `it'." This my essay agrees with but not with such weighty arguments as yours.

                  The research published in the International Journal of Modern Physics, of which I read the abstract, speaks of classical-time and quantum-time approximations and of putting that prediction to the test by laboratory experiments that attempt to construct superposed states of macroscopic objects. For someone with my limited background, that seems impossible at the macro level. I would probably have difficulty following this approach, but I am quite curious.

                  Jim

                    Dear All,

                    It is with utmost joy and love that I give you all the cosmological iSeries, which spans the entire numerical spectrum from -infinity through 0 to +infinity, and the simple principle underlying it is that the sum of any two consecutive numbers is the next number in the series. 0 is the base seed and i can be any seed between 0 and infinity.

                    iSeries always yields two sub semi series, each of which has 0 as a base seed and 2i as the first seed.

                    One of the sub series is always defined by the equation

                    Sn = 2 * Sn-1 + Sigma (i=2 to n) Sn-i

                    where S0 = 0 and S1 = 2 * i

                    the second sub series is always defined by the equation

                    Sn = 3 * Sn-1 -Sn-2

                    where S0 = 0 and S1 = 2 * i

                    Division of consecutive numbers in each of these subseries always eventually converges on 2.618, which is the square of 1.618.

                    Union of these series always yields another series which is just a new iSeries of a 2i first seed and can be defined by the universal equation

                    Sn = Sn-1 + Sn-2

                    where S0 = 0 and S1 = 2*i

                    Division of consecutive numbers in the merged series always eventually converges on 1.618 which happens to be the golden ratio "Phi".

                    Fibonacci series is just a subset of the iSeries where the first seed or S1 =1.

                    Examples

                    starting iSeries governed by Sn = Sn-1 + Sn-2

                    where i = 0.5, S0 = 0 and S1 = 0.5

                    -27.5 17 -10.5 6.5 -4 2.5 -1.5 1 -.5 .5 0 .5 .5 1 1.5 2.5 4 6.5 10.5 17 27.5

                    Sub series governed by Sn = 2 * Sn-1 + Sigma (i=2 to n) Sn-i

                    where S0 = 0 and S1 = 2i = 1

                    0 1 2 5 13 34 ...

                    Sub series governed by Sn = 3 * Sn-1 - Sn-2

                    where S0 = 0 and S1 = 2i = 1

                    0 1 3 8 21 55 ...

                    Merged series governed by Sn = Sn-1 + Sn-2 where S0 = 0 and S1 = 2i = 1

                    0 1 1 2 3 5 8 13 21 34 55 ...... (Fibonacci series is a subset of iSeries)

                    The above equations hold true for any value of i, again confirming the singularity of i.
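
                    A few lines of Python (just a sketch, using the seeds from the examples above, i.e. i = 0.5 so the first seed 2*i = 1) reproduce the two sub-series and the merged series and show the consecutive-term ratios settling near 2.618 and 1.618:

                    # Sketch: the two sub-series and the merged series for first seed 2*i (here i = 0.5).
                    def sub_series_a(n, s1=1):
                        # S_n = 2*S_{n-1} + (S_{n-2} + ... + S_0), with S_0 = 0
                        s = [0, s1]
                        for _ in range(n - 2):
                            s.append(2 * s[-1] + sum(s[:-1]))
                        return s

                    def sub_series_b(n, s1=1):
                        # S_n = 3*S_{n-1} - S_{n-2}, with S_0 = 0
                        s = [0, s1]
                        for _ in range(n - 2):
                            s.append(3 * s[-1] - s[-2])
                        return s

                    def merged(n, s1=1):
                        # S_n = S_{n-1} + S_{n-2}, with S_0 = 0 (Fibonacci when s1 = 1)
                        s = [0, s1]
                        for _ in range(n - 2):
                            s.append(s[-1] + s[-2])
                        return s

                    for name, series in [("sub A", sub_series_a(20)), ("sub B", sub_series_b(20)), ("merged", merged(20))]:
                        print(name, series[:8], "ratio ->", round(series[-1] / series[-2], 3))
                    # The sub-series ratios approach phi^2 ~ 2.618; the merged series approaches phi ~ 1.618.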

                    As per the suggestion of Antony Ryan, a fellow author in this contest, I searched Google to see how Fibonacci-type series can be used to explain Quantum Mechanics and General Relativity, and found an interesting article.

                    [The-Fibonacci-code-behind-superstring-theory](https://msel-naschie.com/pdf/The-Fibonacci-code-behind-super.pdf)

                    Now that I have split the Fibonacci series into two semi-series, it seems that each of the sub semi-series corresponds to QM and GR respectively, and together they explain Quantum Gravity. It seems this duality is a commonality in nature once relativity takes effect or a series is kicked off. I can draw an analogy and say that this dual series within the "iSeries" is like the double helix of our DNA. The only commonality between the two series is at the base seed 0 and first seed 1, which are the bits in our binary system.

                    I have put forth the absolute truth in the Theory of Everything that the universe is an "iSphere" and we humans are capable of perceiving the 4-dimensional 3Sphere aspect of the universe, and have described it with the equation S=BM^2.

                    I have also conveyed the absolute mathematical truth of zero = I = infinity and proved the same using the newly found "iSeries", which is a superset of the Fibonacci series.

                    All this started with a simple question, who am I?

                    I am drawn out of my self or singularity or i in to existence.

                    I super positioned my self or I to be me.

                    I am one of our kind, I is every one of all kinds.

                    I am Fibonacci series in iSeries

                    I am phi in zero = I = infinity

                    I am 3Sphere in iSphere

                    I am pi in zero = I = infinity

                    I am human and I is GOD (Generator Organizer Destroyer).

                    Love,

                    Sridattadev.

                    Professor Singh

                    Richard Feynman in his Nobel Acceptance Speech (http://www.nobelprize.org/nobel_prizes/physics/laureates/1965/feynman-lecture.html)

                    said: "It always seems odd to me that the fundamental laws of physics, when discovered, can appear in so many different forms that are not apparently identical at first, but with a little mathematical fiddling you can show the relationship. And example of this is the Schrodinger equation and the Heisenberg formulation of quantum mechanics. I don't know why that is - it remains a mystery, but it was something I learned from experience. There is always another way to say the same thing that doesn't look at all like the way you said it before. I don't know what the reason for this is. I think it is somehow a representation of the simplicity of nature."

                    I too believe in the simplicity of nature, and I am glad that Richard Feynman, a famous Nobel-winning physicist, also believed in the same thing I do, but I had come to my belief long before I knew about that particular statement.

                    The belief that "Nature is simple" is however being expressed differently in my essay "Analogical Engine" linked to http://fqxi.org/community/forum/topic/1865 .

                    Specifically though, I said "Planck constant is the Mother of All Dualities" and I put it schematically as: wave-particle ~ quantum-classical ~ gene-protein ~ analogy-reasoning ~ linear-nonlinear ~ connected-notconnected ~ computable-notcomputable ~ mind-body ~ Bit-It ~ variation-selection ~ freedom-determinism ... and so on.

                    Taken two at a time, it can be read as "what quantum is to classical" is similar to (~) "what wave is to particle." You can choose any two from among the multitudes that can be found in our discourses.

                    I could have put the Schrodinger wave ontology-Heisenberg particle ontology duality in the list had it come to my mind!

                    Since "Nature is Analogical", we are free to probe nature in so many different ways. And you have touched some corners of it.

                    Good Luck,

                    Than Tin