• FQXi Essay Contest - Spring, 2017
  • Using Klauder’s Enhanced Quantization to set a bound to the Cosmological constant, in Pre-Planckian space, as a way to ascertain the most important fundamental physics question, by Andrew Beckwith

quote

Another question to you is whether or not you consider your approach (which I label for the sake of my question as fully consistent and complete in reference to what we know today about physics) as being necessarily the only one that is able to capture the correct ontology of the universe?

end of quote

An answer which solves the build-up of entropy problem identified by Tolman will suffice.

The answer I gave is a means to average out different contributions to entropy levels at the start of a non-singular universe. I.e., the average level of entropy per cycle, at the start of expansion, would be zero.

If one believes the Penrose singularity theorem (I don't), then entropy is set to zero at the start by certain conventions.

Needless to say, the problem Tolman identified with cyclical models is very serious.

Steinhardt has his own repeating-universe model, which has been partly falsified by recent observations, but it also tried to solve the build-up of entropy problem identified by Tolman in 1934.

IMO an answer which fixes the build-up of entropy per cosmological cycle will suffice.

As it is, I am going to try to present my own findings at Marcel Grossmann 15, and also at DICE 2018, in Pisa, Italy.

Any MODEL which solves that problem is worthy of analysis, Stefan.

Oops, I made an error.

A non-singular starting point to a universe's expansion would imply non-zero initial entropy.

A singularity at the start of a universe (Penrose theorem) would IMPLY NO entropy at the start of expansion.

I have some real problems with the Penrose theorem, as well as with what Hawking and Ellis said in 1973 in their Cambridge University monograph, and will address them in part at Marcel Grossmann 15.

Needless to say, if one has a NON-singular start to expansion, one has initial non-zero entropy, and the Tolman problem of initially increasing entropy levels is then de facto present, and one has to solve it.

All I am doing in my research is trying to give A SOLUTION to the very real Tolman problem of initial entropy build-up per cosmological cycle.

I salute Steinhardt of Princeton University for his very well-thought-out attempt to do the same, Stefan.

Any model which solves the initial build-up of entropy per cycle, Stefan, is worthy of serious analysis.

I did not say it in my six-page paper, but I chose the Klauder ENHANCED QUANTIZATION procedure for a cosmological constant in part as a way to address the build-up of entropy inherent in cyclical-universe models.

    Hello Mr Beckwith,

    I loved your barrier between the Pre-Planckian and Planckian bubble. It is relevant when we consider that gravitation is the chief orchestrator.

    I don't consider a Big Bang in my model of spherisation, with quantum and cosmological spheres inside the universal sphere. I even consider a gravitational aether. I see dark energy as a simple anti-gravitational spherical push. This gravitational aether is probably the answer to quantum gravitation, and there is a link with your Pre-Planckian era when we consider this gravitation. I ask whether the cold is the answer? Do you have an idea about absolute zero and this gravitation?

    Congratulations on your essay. Best regards.

      I am thinking of gravitation in terms of an emergent field analogy. It is not the same as your suggestion, but your motivation is not too far from what started my inquiry.

      Also look at this business of NLED (nonlinear electrodynamics) and GR:

      https://arxiv.org/abs/1512.07579

      Also see work done by Camara:

      https://arxiv.org/abs/astro-ph/0402311

      They consider whether the cosmological constant is time-independent or time-dependent, and go to the regime of a quantum bounce at the start of nucleation of a new universe.
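
      Schematically (my own shorthand for that distinction, not a formula lifted from their paper), the two cases differ in whether the Lambda term of the Friedmann equation carries time dependence:

      H^2 = \left(\frac{\dot{a}}{a}\right)^2 = \frac{8\pi G}{3}\rho - \frac{k}{a^2} + \frac{\Lambda(t)}{3}

      with the time-independent case setting \Lambda(t) = \Lambda = \mathrm{const}.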

      The aether, in terms of gravitation, may be, as you described it, partly addressed by

      https://arxiv.org/abs/astro-ph/0402311

      My work is roughly congruent with the case of a time-independent cosmological constant, as they describe it.

      You may wish to consider if your gravitational aether model is congruent with their work.

      Thanks for your viewpoint and outlook.

      As to the matter of "cold" and an anti-gravitational push, all I can say is that there are various models of the cosmological constant. Depending upon its initial sign, the cosmological constant could conceivably be connected initially to an "anti-gravitational" push, as you referred to it. That would be if the sign of the cosmological constant were negative.

      As I diagrammed it out, using enhanced quantization, the cosmological constant initially has a positive sign.

      If the sign were instead negative, then your idea of an anti-gravitational push could be entertained.

      That is a matter of further research and speculation though.

      Thanks for your contribution to this discussion

      "Any model which solves the initial build up of entropy per cycle, Stefan, is worthy of serious analysis."

      That's my point of view too, since we are all working on solutions that could bring us closer to the truth.

      Good luck with your attempt!

      The statement that there is a requirement for a cessation of monotonic increases in the initial entropy at the start of repeated cosmological cycles is a necessary condition for avoiding the catastrophe given in Tolman's 1930s cosmology tome, which specified that repeating cycles of cosmological rebirth would by necessity create an ever-increasing entropy load for successive universes to coexist with, as far as evolutionary dynamics are concerned. The end result is that if there were a perpetual increase in entropy per cosmological cycle, then, God forbid, the Friedmann evolution equations would no longer work.

      I.e., there would be no sense in talking of eternal time; i.e., cosmological existence, if there were a cyclical universe, would not be a dynamic process.
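
      To sketch the usual form of Tolman's argument (my gloss, not a quotation of his tome): for a closed, radiation-dominated cycle, with units c = 1 and curvature k = 1, the Friedmann equation and the entropy S carried by the radiation give

      \left(\frac{\dot{a}}{a}\right)^2 = \frac{8\pi G}{3}\rho_r - \frac{1}{a^2}, \qquad \rho_r \propto \frac{S^{4/3}}{a^4}

      so that the turnaround radius scales as a_{\max} \propto S^{2/3}. Any entropy generated within a cycle therefore enlarges and lengthens the next cycle, and the cycles cannot repeat identically.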

      The alternative to a big crunch, and then steadily increasing levels of entropy at the start of a new universe cycle, is that there would be an averaging out of entropy at the start of a new cosmological expansion, as I specified in the multiverse generalization of the cyclic cosmology picture.

      Not specified, though, is one huge issue to parse: if the multiverse existed, with different universes contributing to an initial partition function of a newly expanding universe, would we have constancy of physical law per cycle, and what would that say about the speculation as to whether there is a Darwinian process for the creation of new universes?

      See

      https://arxiv.org/ftp/gr-qc/papers/0205/0205119.pdf

      as given by Vaas

      I will spend a lot of time trying to fine-tune an answer to this speculation, and to come up with a procedure which coheres and admits the possibility of an eternal multiverse, while considering that individual universes may have different fates.

      I.e., invariance of a multiverse of perhaps up to an infinite number of different constituent evolving universes.

      An interesting paper, but your formulae are beyond me (as I have only met the standard FRW formulae), so I cannot comment on them. However, you have also made useful explanatory points in your posts which interest me in relation to Penrose's CCC model. May I ask if there is a simple reason why you do not agree with the Penrose resetting of entropy to zero? I ask because I accept it as reasonable, but of course I could very easily be wrong.

      I imply my acceptance of Penrose's CCC in my contest paper, and although CCC is important to me it is only a side issue in my paper. It seems to me that there are two ideas at the CCC node: 1) losing the metric and 2) losing the entropy.

      I came late to physics after retirement and my background is in psychometrics. What I knew about making metrics in psychometrics readily led me to accept Penrose's method for losing the metric. I will gladly write more about that if you are interested and not familiar with the Rasch method of making metrics, and also with the havoc played by a Guttman structure of data when trying to make metrics. I am not 100% settled on when the metric is lost. I think that it could degrade in stages before reaching the node. The issue of the metric, in my opinion, also affects the entropy issue.

      Best wishes

      Austin

        Oops, please do bring up the Rasch method. You are correct: I do not know of it.

        thanks

        Andrew

        I am glad you are interested in hearing more on Rasch. The experts are at http://winsteps.com/winsteps.htm and at https://www.rasch.org/ .

        I have used some Rasch programs but am not an expert in writing the model or the software.

        The Rasch model https://www.rasch.org/memo19662.pdf claims to make rating measurements on a ratio scale equivalent to scales in the physical sciences, whereas psychological ratings are usually on a much weaker scale. [The Rasch scale is only made in 1D and so is clearly inadequate to make a 3D metric of space for physicists.]
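
        For orientation, the dichotomous form of the model (the standard textbook form, not anything specific to my paper) gives the probability that person n succeeds on item i as

        P(x_{ni} = 1) = \frac{e^{\theta_n - b_i}}{1 + e^{\theta_n - b_i}}

        where \theta_n is the person's location (ability) and b_i is the item's location (difficulty); estimation places both sets of parameters on one common interval scale.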

        There is (in my opinion) a paradox at the heart of the Rasch model: a Guttman scale is the target of the model, yet data in a perfect Guttman structure break the model. In a Guttman scale in, say, a football league table, every team in the table beats every team below it in the table. Such data can only give an order of merit and say absolutely nothing with respect to the intervals between the teams. For example, the top team could be professional adults and all the other teams junior amateurs. The fact that the top team beats all the others says nothing except that they are the best team; not by how much they are better. The implication is that error is needed in the data, with some teams sometimes losing to inferior teams, in order to get a handle on interval sizes. Perfect Guttman data have no error in them. Recently, entanglement has been somehow associated with making judgements of closeness (a Susskind online video, ref?), and to me entanglement implies potential closeness because the two particles were born at the same time and place. Guttman data may be occurring near the end of a CCC cycle. What few fermions are left are far scattered, and there is probably little error in which fermion is the furthest away. I think the metric could be lost even with some fermions remaining unevaporated, rather than waiting for the last fermion and Black Hole to evaporate.
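
        As a minimal numerical sketch of that breakdown (my own illustration, assuming a naive joint maximum-likelihood fit; real Rasch software such as Winsteps uses more refined estimation), one can watch the estimates diverge on perfect Guttman data:

        import numpy as np

        # Naive joint maximum-likelihood Rasch fit on a perfect Guttman
        # response matrix. The data are perfectly "separated", so the
        # likelihood keeps improving as person/item locations spread
        # apart, and the estimates diverge instead of converging.
        n_persons, n_items = 6, 5
        # Perfect Guttman structure: person i succeeds on item j iff i > j.
        X = np.array([[1.0 if i > j else 0.0 for j in range(n_items)]
                      for i in range(n_persons)])

        theta = np.zeros(n_persons)   # person ability locations
        beta = np.zeros(n_items)      # item difficulty locations

        lr = 0.1
        for step in range(1, 4001):
            # Rasch success probability: P = sigmoid(theta_i - beta_j)
            P = 1.0 / (1.0 + np.exp(-(theta[:, None] - beta[None, :])))
            resid = X - P
            theta += lr * resid.sum(axis=1)   # gradient ascent on the log-likelihood
            beta -= lr * resid.sum(axis=0)
            beta -= beta.mean()               # anchor the scale origin
            if step % 1000 == 0:
                # The spread grows without bound: no finite estimates exist.
                print(step, round(theta.max() - theta.min(), 2))

        Introduce a little error (a few upsets in the matrix) and the same loop settles on finite locations; that is exactly the paradox: the model targets a Guttman ordering, but needs imperfection in the data to calibrate the intervals.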

        My contest paper has a reference to my 2016 Rasch paper, Pseudo-Random Data Testing The Scales Used In Rasch Pairs Analysis / Adaptive Comparative Judgement: viXra:1609.0329 ... The paper shows what happens when you try to make metrics as the error in measurement gets reduced. And it shows cases where the metric partially breaks down, for some objects. This corresponds to my idea of a gradual fracture of the metric. [The main aim of that paper was to try to mimic the compression of scales near, say, a Black Hole. Does the Rasch scale get compressed where the data are actually more compact? It seems to be the case. But that is a different issue.]

        Also, the Rasch metric (in my opinion) does not allow two objects to be arbitrarily close to one another. There seems to be a coarseness of scale depending on the data inputted. So I don't think there is necessarily a danger of infinities arising, dependent on division by an infinitely small interval in the metric.

        Lastly, some infinities arise which are not worrying. Say every essay in a contest was rated as 5 out of 10. Unusual, but not impossible. The variance is zero, and the standardised score for every essay would be infinite. But that is not a worry when the raw ratings are so understandable. Just don't divide them by zero. Likewise, as far as I know, all the photons at the end of a CCC cycle are in one BEC condensate state. I am not sure how the infinities arise there, probably not by dividing by the variance, but being in one state [and hence low entropy?] doesn't seem so bad to me.

        http://winsteps.com/winsteps.htm and at https://www.rasch.org/

        I tried to access this link but could not. Do you have another link? Thanks

        They work for me, but they are two links, rather than one, to two sources of Rasch software and expertise ...

        http://winsteps.com/winsteps.htm

        and

        https://www.rasch.org/

        Does that help? If not I will look up more sources.

        Best wishes

          Yes, they do work. Thank you.

          Not a criticism, but the links appear to be about data analysis; can you explain the linkage to cosmology?

          Pardon me for being so tone-deaf. I have been ill for 20 hours and have been sleeping most of the time.

          Tomorrow, I should be able to understand your point

          I sympathise and empathise fully with you and hope you are now getting some sleep. I am in my fourth week of flu. I sent in my contest essay when the flu was at its worst. I went to bed tonight but was too ill to sleep and so am typing this three hours after midnight.

          I think I am not explaining myself well. I am possibly the only person who sees any relevance of Rasch analysis to Penrose's CCC. There are no Rasch papers written, as far as I know, pertaining to cosmology. Rasch analysis is used for tasks such as item analysis in examinations and analysing questionnaire scales. Quite often measurements or ratings get added and averaged, etc., without much care about the nature of the rating scale. Rasch analysis aims to improve the quality of the scale of the results, for example by adding or averaging modified ratings rather than adding the raw ratings.

          Forget the previous links that I listed.

          Try the wiki website:

          https://en.wikipedia.org/wiki/Rasch_model

          for an overview of the Rasch model.

          However, the only Rasch paper that I can show you which is not using Rasch in a standard psychometric context is my own paper at

          http://vixra.org/abs/1609.0329

          In that paper you can see a number of metrics made by the Rasch model. Some of these metrics break down. I am suggesting that these metrics break down possibly for the same reason that the metric breaks down at the end of a Penrose CCC cycle. And that reason is that the nature of the data is too perfectly Guttman, with too little error in the data. This idea does make the bold assumption that the universe's space metric can somehow be constructed and destroyed in a manner similar to running a Rasch analysis! And maybe this is too off-beat a step for you to want to follow further? If so, that would be understandable.

          Best wishes

          Austin

          Quote from your viXra paper:

          This paper shows that a Rasch analysis compresses its location parameter space according to the level of uncertainty in making judgements within that space. The more uncertain the judgements, the more compressed are the points on the scale. The more uncertain the judgements, the more the location parameters are close to one another, so that uncertainty in making judgements is equivalent to homogeneity in positions of objects.

          Dear Sir, the point of this appears to be connected to the idea of avoiding space-time singularities.

          Is this the interpretation you are seeking?

          Thanks for your input

          You are correct up to but excluding the para beginning "Dear Sir". Thank you for persevering.

          Para beginning "Dear Sir":

          My Table 1 shows three objects having the same location parameter: -0.18. If the objects were 1D fermions, this would already have broken Pauli's Exclusion Principle, so that particular metric would have exceeded its maximum capacity for holding fermions. Also, Table 1 uses a finite and small number of different location parameters, and that is the feature which I claim prevents intervals between fermions being zero on such a metric, assuming only one fermion per location. I realise that argument could be deemed circular. [My contest paper uses a preon model, with strings, so I do not have singularities for Standard Model point particles, as they are divisible in my model. So I do not worry about the location parameters being points.]

          I apologise for my lack of clarity. What I need to do is re-write my Rasch paper to bring in the new physics contexts. And I would understand if you deferred until I finished that paper. There is hardly any discussion in my Rasch paper because of the nature of the origin of the paper. In a late use of Rasch in psychometrics before I retired, an issue arose over whether one should use standard statistical tests of significance on the Rasch results from a particular experiment, ignoring that the results came from a Rasch analysis, or whether one could squeeze more error out of the findings by using extra information from the fact that a Rasch analysis was used. And the paper was written for that psychometric purpose. But I realised that I could try to mimic the effects of GR compressing metrics near masses. So, as I was retired and could do as I pleased, I added that in without any discussion. I have since realised that the same data can be extended to try to show why the metric breaks down near a CCC node, but I have not amended the discussion to explain how. It is not fair of me to explain on the hoof, but maybe just one more para might help.

          The part of your post that I agreed was correct was the emphasis that uncertainty is equivalent to homogeneity. That is for nearby space. The opposite is true for far-flung space approaching a CCC node. That is, lack of homogeneity is equivalent to no error. And 'no error' implies a Guttman structure of data. And a Guttman structure of data plays havoc when constructing metrics; and not just the Rasch metric, but any kind of metric, in my opinion. So the metric collapses at the CCC node.

          Best wishes

          Austin

          Dear Professor Andrew Beckwith,

          My research has concluded that Nature must have devised the only permanent real structure of the Universe obtainable, for the real Universe existed for millions of years before man and his finite complex informational systems ever appeared on earth. The real physical Universe consists only of one single unified VISIBLE infinite surface occurring eternally in one single infinite dimension that is always illuminated mostly by finite non-surface light.

          Joe Fisher, ORCID ID 0000-0003-3988-8687. Unaffiliated

          Dear Andrew Walcott Beckwith,

          I believe you often attempt to bound phenomena, and herein you derive an explicit bound on the cosmological constant, based on Klauder's enhanced quantization.

          Your equations are impossible to critique (I pity your reviewers!), but your basic concept seems to be that of a space-time "wall" separating pre-Planckian from Planckian regimes. I have difficulty conceiving of such a wall, but then I have difficulty conceiving of lots of things.

          You might wish to read my comment on Klauder's essay page, where I key off of his basis in Dirac to note our friend Steven Kauffmann's paper pointing out nonsense results from the Dirac equation and attributing these to Dirac's consideration of space-time symmetry issues that he used instead of deriving his equation from a corresponding classical Hamiltonian. It is difficult to know just how far this nonsense can or has ricocheted in relativistic quantum field theory.

          Which brings me to my essay that treats the historical development of 'space-time symmetry' and raises questions about it. I hope you will read my essay and comment.

          My very best regards,

          Edwin Eugene Klingman


            Quoting what I said in your essay discussion:

            quote

            Edwin

            I have to commend you on a witty essay, and I liked it enough that I gave you a grade of 8, i.e., very well done.

            However, this is my nit.

            The initial time step, call it delta t, is either intrinsic to a system, as done by Barbour in his essay about emergent time, or it is superimposed upon the system, say by cyclic cosmological intervention from prior universes upon our present universe.

            In essence, I would like a clear distinction made between emergent time, as stated by Barbour, and time set by some other agency, say as in conformal cyclic cosmology (Penrose).

            Aside from these nits, I frankly felt your essay was the most enjoyable one I have encountered in this contest and I am saving it as a gem.

            Just because I raise this issue does not mean I disapprove. On the contrary, I give you high marks and am asking for an extension of your dialogue to include the distinguishable choice I am referring to.

            Andrew

            end of quote

            Answering you was a pleasure, Edwin, but the choice I made was to include time in the form given by Barbour,

            https://arxiv.org/pdf/0903.3489.pdf

            And the superstructure I used was to focus upon the cosmological constant, as I referenced it, as a way to initiate the placing of time, as I saw it, in the present cosmos.

            Hence, I worked with forming the cosmological constant as a benchmark for initial conditions enabling the development of time as given by

            https://arxiv.org/pdf/0903.3489.pdf

            What may surprise you, Edwin, is that I initially set out to make my essay about time, and shifted to the cosmological constant, as referred to in my essay, after reviewing what I know of time, as a way to conjecture an initial structure consistent with

            https://arxiv.org/pdf/0903.3489.pdf