Essay Abstract

Fundamentality is paradoxical with respect to size and complexity, and a natural way through that paradox is to identify the nature of fundamentality through the telescopic and microscopic lenses of quasi-fractal repetition of patterns. Featured points are the emergence of mass and space, the dependency of electric charge on strong colour, the nature of substance, the use of a weaker colour in gravitation, the unification of four forces, the generations interpreted as differing in complexity and the missing antimatter maybe hiding in plain sight. Criteria include: persistence of form or substance; indivisibility; invariance; complexity and emergence. Contexts are important, such as the position, speed and scale of the observer in relation to the potentially-fundamental observed subject.

Author Bio

Retired researcher into school examinations with a UK examining board. My B.Sc. degree is in mathematics and statistics.


6 days later

Dear Austin James Fearnley,

You wrote, as if you were a staunch United fan, that: "A fundamental property should be persistent over time."

My research has concluded that Nature must have devised the only permanent real structure of the Universe obtainable, for the real Universe existed for millions of years before man and his finite complex informational systems ever appeared on earth. The real physical Universe consists only of one single unified VISIBLE infinite surface occurring eternally in one single infinite dimension that is always illuminated mostly by finite non-surface light.

Joe Fisher, ORCID ID 0000-0003-3988-8687. Unaffiliated

10 days later

Hi Austin James Fearnley

You identified "the emergence of mass and space, the dependency of electric charge on strong colour, the nature of substance, the use of a weaker colour in gravitation, the unification of four forces, the generations interpreted as differing in complexity and the missing antimatter maybe hiding in plain sight. Criteria include: persistence of form or substance; indivisibility; invariance; complexity and emergence. Contexts are important, such as the position, speed and scale of the observer in relation to the potentially-fundamental observed subject." That is really an exhaustive list for fundamentality in particle physics and cosmology, I think, respected Austin James Fearnley.

I highly appreciate your essay, and I request you to please spend some of your valuable time on the Dynamic Universe Model also and give some of your valuable and esteemed guidance.

Some of the main foundational points of the Dynamic Universe Model:

-No Isotropy

-No Homogeneity

-No Space-time continuum

-Non-uniform density of matter, universe is lumpy

-No singularities

-No collisions between bodies

-No blackholes

-No wormholes

-No Bigbang

-No repulsion between distant Galaxies

-Non-empty Universe

-No imaginary or negative time axis

-No imaginary X, Y, Z axes

-No differential and Integral Equations mathematically

-No General Relativity and Model does not reduce to GR on any condition

-No Creation of matter like Bigbang or steady-state models

-No many mini Bigbangs

-No Missing Mass / Dark matter

-No Dark energy

-No Bigbang generated CMB detected

-No Multi-verses

Here:

-Accelerating Expanding universe with 33% Blue shifted Galaxies

-Newton's Gravitation law works everywhere in the same way

-All bodies dynamically moving

-All bodies move in dynamic Equilibrium

-Closed universe model no light or bodies will go away from universe

-Single Universe no baby universes

-Time is linear as observed on earth, moving forward only

-Independent x, y, z coordinate axes and time axis; no interdependencies between axes

-UGF (Universal Gravitational Force) calculated on every point-mass

-Tensors (Linear) used for giving UNIQUE solutions for each time step

-Uses everyday physics as achievable by engineering

-21000 linear equations are used in an Excel sheet

-Computerized calculations use 16-decimal-digit accuracy

-Data mining and data warehousing techniques are used for data extraction from large amounts of data.

- Many predictions of Dynamic Universe Model came true....Have a look at

http://vaksdynamicuniversemodel.blogspot.in/p/blog-page_15.html

I request you to please have a look at my essay also, and offer some of your esteemed criticism.

The Dynamic Universe Model says that energy in the form of electromagnetic radiation passing grazingly near any gravitating mass changes its frequency and finally will convert into neutrinos (mass). We all know that there is no experiment or quest in this direction. Energy conversion happens from mass to energy via the famous E = mc^2; the other side of this conversion was not thought of. This is a new fundamental prediction of the Dynamic Universe Model, a foundational quest in the area of astrophysics and cosmology.

In accordance with the Dynamic Universe Model, a frequency shift happens on both sides of the spectrum when any electromagnetic radiation passes grazingly near a gravitating mass. With this new verification, we will open a new frontier that will form the basis for continual nucleosynthesis (continuous formation of elements) in our Universe. The amount of frequency shift will depend on the relative velocity difference. All the papers of the author can be downloaded from "http://vaksdynamicuniversemodel.blogspot.in/".

I request you to please post your reply in my essay thread also, so that I get an intimation that you have replied.

Best

=snp

Dear Austin,

An interesting essay. You might be interested in my electron/positron model. See my paper here:

http://vixra.org/abs/1507.0054

I think you are in need of a boost in the ratings, so I have given you a score of 10 to help...

Best Regards,

Declan


Declan

Thank you kindly for your interest in my paper and also for the rating. As I understand it, the rules prohibit any discussion of actual scores, so I cannot reply in kind in this discussion. But you will note that essays are to be marked on interest and relevance, and of course I found your essay to be high in both qualities, though I have not made any ratings yet for anybody. But assuredly I will do so.

I have looked at your vixra paper on your electron/positron model and it seems interesting, but I need to look further. In my preon model (though I have not covered this point in my last three model numbers; I need to go back to around model #3) there is a spinning triple-helix structure which holds together hexarks within the preons. From a string theory p.o.v. that basically means that at the heart of each preon there are multiple attachments to three Red, Green and Blue colour branes spinning around one another at relativistic speed, maybe in a Hopf fibration. This spinning needs an even stronger force than strong colour to hold it together, so it is unlikely to be related to the spinning in your vixra paper. My preons are attracted to one another by QED, as preons need to dissociate in QED interactions, so maybe I can think about how preon interactions could connect to your paper.

I still have a couple of points about your contest paper which I will likely post in your thread. But I have not got to grips with the Steering Inequality yet. The Steering Inequality is a theoretical device (2011 paper) which is not part of the recent experimental results (2015 paper)? The 2015 paper did not mention 'steering' AFAIK and anyway they only had 245 pairs of outcomes. And surely they did not contaminate an experimental finding with results other than +1 or -1? I presume the steering inequality implies that experimental data were discarded or not measured for some genuine pairs which would have reduced the correlation absolute size? Or am I missing something from the 2015 experimental design?

Best

Austin

I will read your essay as soon as time permits. I wrote the following on my area in response to your comments:

The quantum hair appears on the event horizon, or really the quantum membrane called the stretched horizon, as seen by a stationary observer. For a distant observer the tortoise or time-delayed coordinate r* = r + 2m ln|r/2m - 1| means these are redshifted enormously. However, an observer on an accelerated frame close to the horizon will observe more of this physics. Of course this requires an observer or probe on a frame with an enormous acceleration, up to 10^{30} m/s^2.
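Spelled out in the usual Schwarzschild conventions (G = c = 1), a rough gloss on why the distant observer sees the enormous delay and redshift:

```latex
% Tortoise coordinate for a Schwarzschild black hole of mass m, with G = c = 1
% (standard textbook form, quoted only to unpack the "redshifted enormously"
% statement above):
\[
  r_* \;=\; r + 2m \ln\!\left|\frac{r}{2m} - 1\right|,
  \qquad
  \frac{dr_*}{dr} \;=\; \left(1 - \frac{2m}{r}\right)^{-1}.
\]
% As r -> 2m the factor (1 - 2m/r)^{-1} diverges and r_* -> -infinity, so
% signals from just outside the stretched horizon reach a distant observer
% only after an unbounded delay and with an unbounded redshift.
```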

In the collision of black holes quantum hair participates in the production of gravitons or quantum signatures in gravitational radiation. This is one big thrust of my essay; quantum gravitational signatures at the foundations of the universe are potentially detectable.

There is this problem with how gravitation and quantum mechanics merge or function in a single system. It is often said we understand nothing of quantum gravity, and this is not quite so. Even the basic canonical quantization of gravity from the 1970s is computable in a weak limit and tells you something. This theoretical understanding is very limited and big open questions remain. Of course since then far more progress has been made. The AdS/CFT correspondence, the Raamsdonk equivalence between entanglement and spacetime, and the RT (Ryu-Takayanagi) formula are some of the more recent developments. These indicate how spacetime physics has a correspondence, or maybe an equivalence, with quantum mechanics or quantum Yang-Mills fields. However, an obstruction exists that appears very stubborn.

The vacuum is filled with virtual pairs of fields. With a black hole, the gravity field causes one member of such a pair to fall into the black hole and the other to escape. This means the quantum particle or photon that escapes as Hawking radiation is entangled with the partner that falls into the black hole, and so Hawking radiation is entangled with the black hole. At first blush there seems to be no problem. However, if we think of a thermal cavity heated to high temperature, the photons that escape are entangled with the quantum states of the atoms composing the cavity. Once the entanglement entropy reaches a maximum, at half the energy released, the subsequent photons released are entangled with prior photons released. This would hold with black holes as well, but because of the virtual-pair nature of this radiation it means Hawking radiation previously emitted in a bipartite entanglement is now entangled not just with the black hole but with more recently emitted radiation as well. This means a bipartite entanglement is transformed into a tripartite entanglement. Such transformations are not permitted by quantum unitary evolution. This is the quantum monogamy requirement, and what it suggests is that unitarity fails. To prevent the failure of quantum mechanics, some have proposed a firewall at the horizon, which violates the equivalence principle.
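As a quantitative anchor for the monogamy statement, a standard qubit-level form of the constraint (quoted here only as a gloss, with C the concurrence between subsystems) is the Coffman-Kundu-Wootters inequality:

```latex
% CKW monogamy inequality for a qubit A shared with systems B and C,
% where C_{X|Y} denotes the concurrence between X and Y:
\[
  C^{2}_{A|B} + C^{2}_{A|C} \;\le\; C^{2}_{A|(BC)}.
\]
% If previously emitted radiation (A) is already maximally entangled with
% the black hole (B), it cannot also become entangled with later radiation
% (C) without violating this bound -- the tension behind the firewall argument.
```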

The firewall occurs when half the possible radiation is emitted, which is also the Page time. This also corresponds to the failure of a quantum error correction code. Error correction codes involve some deep mathematics; they are connected with the RT formula, and I illustrate in my essay the connection with Mirzakhani's mathematics on geodesics in hyperbolic spaces. Error correction is also tied to the packing of spheres, or how oranges stack at the grocery store: the Kepler problem. This gets into the guts of what my paper is about. The point here, however, is that error correction corrects the mixing of information. Think of a library, in particular an elementary school library with little kids, where the patrons scramble up the order of books. The distance a book ends up from its right position is the Hamming distance. As the library gets mixed up, an algorithm can manage this disordering. However, at about half mixed up, things break down. The librarian has to virtually start over.
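A toy version of the library bookkeeping, in the spirit of the analogy rather than of the actual error-correction mathematics (the shelf size and the amount of shuffling per round are arbitrary choices):

```python
import random

# Toy model of the scrambled library: count how many books sit in the
# wrong place after each round of shuffling -- a Hamming-distance-style
# measure of the disorder a correcting algorithm would have to undo.
SHELF_SIZE = 100
SWAPS_PER_ROUND = 20   # arbitrary: how much the patrons scramble per round

def hamming(a, b):
    """Number of positions at which the two shelf orderings differ."""
    return sum(x != y for x, y in zip(a, b))

correct = list(range(SHELF_SIZE))
shelf = correct.copy()

random.seed(1)
for round_no in range(1, 8):
    for _ in range(SWAPS_PER_ROUND):
        i, j = random.sample(range(SHELF_SIZE), 2)
        shelf[i], shelf[j] = shelf[j], shelf[i]
    print(round_no, hamming(correct, shelf))

# The distance climbs toward its typical maximum; past roughly the halfway
# point the ordering is effectively scrambled, which is the analogue of the
# breakdown at the Page time described above.
```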

The solution from Susskind and others is to say spacetime variables and quantum states are equivalent. I do not disagree completely, but I think this is a complementarity instead of an equivalence. It means that with either spacetime or quantum states you can account for the system, but at the expense of abandoning a description of the system by the other. You can't describe quantum gravity completely by both in the same measurement description. So this is a sort of Heisenberg uncertainty, if you will.

Cheers LC

    Dear Lawrence

    [I have sent this post to both our threads.]

    Thank you very much for your reply. The making and breaking of space metrics is a minor part of my essay (pages 3 and 5). My employment background includes making metrics for examination scores using Rasch pairs analysis [fortunately I never encountered FQXI-style 1-bombing ratings there]. Obviously, trying to make a connection between psychometric metrics and the spacetime metrics of BH hairs is a long and tenuous stretch. I have followed all of Susskind's online "theoretical minimum" courses, including SR, GR and cosmology, which cover BHs but only the starting-point basics. I have read your reply but will need to work at it extensively to follow it. I still have some points, though, which you might kindly clarify.

    You mention the Raamsdonk equivalence between entanglement and spacetime: that equivalence sounds somewhat similar in aim to an idea I wrote in sci.physics.foundations in 2011.

    ".... two entangled binary spins of electrons with random total spin, but perfectly correlated within the pair, seems a little like looking at the raw data for a rasch [pairs] analysis. ... Surely the binary spin data can't be the raw decisions which determine the emergent space [?]..."

    https://groups.google.com/forum/#!msg/sci.physics.foundations/UIpgAj43QXg/lmXQajBksZUJ

    I tried the idea in a Rasch pairs analysis soon afterwards and reported it in my 2016 paper at http://vixra.org/abs/1609.0329

    where I was trying to see if I could compress the emergent space metric [arising from a Rasch pairs analysis] near a large 'mass': it seemed to work OK. Alas, the paper is not aimed at physics and has hardly any discussion. I need to re-write it to discuss GR and CCC physics.

    For examinations one often has two metrics which one needs to link together, such as for two parallel alternative tests. They need to be linked for comparability of grading outcomes. One way is to put a small amount of overlapping data in both of the alternative tests.
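    A very small numerical sketch of that linking idea, with invented item difficulties and a simple mean shift on the common items (real test equating is more elaborate than this):

```python
# Two alternative tests analysed separately, each yielding item difficulties
# on its own metric; "anchor1" and "anchor2" are the overlapping items that
# appear in both tests.  All numbers are invented for illustration.
test_a = {"a1": -1.2, "a2": 0.3, "anchor1": 0.8, "anchor2": -0.4}
test_b = {"b1": 0.9, "b2": -0.7, "anchor1": 1.1, "anchor2": -0.1}

anchors = ["anchor1", "anchor2"]

# Linking constant: the average offset of the shared items between metrics.
shift = sum(test_a[k] - test_b[k] for k in anchors) / len(anchors)

# Re-express test B's difficulties on test A's metric, so that grading
# outcomes on the two tests become comparable.
test_b_on_a = {item: diff + shift for item, diff in test_b.items()}

print(f"linking shift = {shift:+.2f}")
print(test_b_on_a)
```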

    In the BH context this amounts to having pairs of entangled particles in both spaces simultaneously, i.e. both inside and outside the BH. But both entangled particles need to be in both spaces. I cannot simulate, in a Rasch pairs analysis, the use of only half of one pair in one metric. So I suspect that it may be impossible in general to use entangled pairs to make a metric when the two single particles of an entangled pair are in different spaces.

    At a Penrose CCC node, all the stuff in the universe is in the form of photons in a single state at almost infinite wavelengths. The metric breaks down at the node in his model and I agree with that. But the stuff in the photons continues to exist even though the metric has gone. So a broken metric does not imply destruction of the 'stuff' in the old metric.

    The Page time etc. is all new to me, but I am familiar with a metric breaking down gradually (your library books analogy). I suggested in my essay that the breakdown of the space metric may occur gradually before the CCC node is reached. The metrics I produced in my 2016 paper (ref above) show that, in special circumstances, some pairs of data do not get included in the metric. And sometimes the metric fails completely to plot any data. And I agree that this is connected to a sort of Heisenberg uncertainty. But in my psychometric area the problem is referred to as the problem of Guttman data. Guttman data kills metrics. And Guttman data is data with zero uncertainty.

    As an example, one could use Rasch pairs analysis to construct a metric for ratings in this contest. One datum point is where essay A is deemed to be better than essay B. Deemed, that is, by contestant C. Repeat for all pairs of essays and all contestants as raters. The metric would be most compressed if all the ratings of 'better' or 'worse' were at random, corresponding to large uncertainty. But on the other hand, if every rater put the scripts in the same order of merit as every other rater, then the analysis would collapse because there was no error in the system: corresponding to Guttman data and zero uncertainty. This is a nice explanation for the need for the existence of uncertainty, as we would not be placed in our space metric without it.
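    To make the example concrete, here is a minimal sketch of fitting such a metric from 'essay A judged better than essay B' counts, using a Bradley-Terry model as a stand-in for the Rasch pairs analysis (the counts are invented, and the real analysis is richer than this):

```python
import numpy as np

# wins[i, j] = number of raters who judged essay i better than essay j.
# Noisy, inconsistent judgements -> a finite, well-behaved metric.
wins = np.array([[0., 3., 4.],
                 [2., 0., 3.],
                 [1., 2., 0.]])

n_pairs = wins + wins.T                 # comparisons made for each pair
ability = np.ones(len(wins))            # initial strengths

# Standard iterative maximum-likelihood (MM) update for Bradley-Terry.
for _ in range(200):
    total_wins = wins.sum(axis=1)
    denom = np.array([
        sum(n_pairs[i, j] / (ability[i] + ability[j])
            for j in range(len(wins)) if j != i)
        for i in range(len(wins))
    ])
    ability = total_wins / denom
    ability /= ability.sum()            # fix the overall scale

print(np.log(ability))                  # the emergent one-dimensional metric

# If every rater agreed on the same order of merit (Guttman data, zero
# uncertainty), the top essay would have no losses, the maximum-likelihood
# estimates would diverge, and no finite metric could be constructed --
# the collapse described above.
```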

    Dear Fellow Essayists

    This will be my final plea for fair treatment.

    Reliable evidence exists that proves that the surface of the earth was formed millions of years before man and his utterly complex finite informational systems ever appeared on that surface. It logically follows that Nature must have permanently devised the only single physical construct of earth allowable.

    All objects, be they solid, liquid, or vaporous, have always had a visible surface. This is because the real Universe must consist only of one single unified VISIBLE infinite surface occurring eternally in one single infinite dimension that is always illuminated mostly by finite non-surface light.

    Only the truth can set you free.

    Joe Fisher, Realist

    I finally read your paper. There are a few points I can make. You have some issue with the spin of the graviton. It must be spin 2 because the classical gravitational wave has two helicity states. It is as if the graviton is a sort of diphoton, such as an entangled photon state or an HBT-effect photon state, or two gauge particles. The electric field vectors which form the polarization direction are such that a diphoton has two electric fields. Similarly, a gravitational wave has two polarization directions.

    Preons and rishons have, as you indicate early on, issues that are difficult. The problem is that the mass of these particles is small while the gauge potential of their interactions is large. In QCD the masses of quarks are on the order of tens of MeV while the masses of baryons are 1 GeV or more. This makes the renormalization problem difficult, which is what 't Hooft and Veltman won the Nobel Prize for working out. If quarks and leptons are composed of rishons or preons, the problem is far more difficult.

    We will have to see if leptons and quarks are indeed composites, though in QFT the idea of what is composite is somewhat difficult and problematic to define. There are experiments that measure the Landé g-factor of the electron to attempt to determine if there are deviations in the radial form of the electric field and in the nature of the spin based on the magnetic field. So far the data indicate the electron is point-like down to within 10 orders of magnitude of the Planck scale. Presumably the other leptons and quarks are, or are not, structured similarly.

    LC

    I just boosted your score by 0.3 points. Your paper is in some ways a diamond in the rough. It does show some insight and thought. My opinion of preons and rishons is that I keep these in an archive just in case it turns out experiments show the electron has some structure. This would occur if the Landé g-factor deviates from the QED prediction.

    Cheers LC

    Lawrence

    Thank you for the rating and even more so for your comment. I really appreciate that!

    You raised three issues with my model and noted that I had acknowledged the first in my paper:

    Yes, renormalisation will be difficult to establish for preons ... but pre-1970s it was also difficult in QED/QCD. Humans have historically put themselves in pride of place at the centre of things: an earth-centred universe and gods made in man's image. When I was young, I worried emotionally rather than scientifically over why I was 'where' I was and why I was 'when' I was. That is probably why I now have an intuition that we just happen to be at the scale we are, but I do not rule out things happening at much different scales.

    There were half a dozen topics that I had no space to include in my essay. One of them is relevant here and one later on. In my model, particles collapse their wavefunctions much as the universe comes to a CCC node. So an electron can have multiple components so long as those components can be brought to a single and final BEC state. That does not look easy for the preons in an electron, say ACBB'. Preon B and antipreon B' could be brought to a single bosonic state. C is bosonic, so that could reach a final single state, but A is fermionic and I am not clear whether that allows the whole electron to be in one final state. Bringing the universe to a CCC node of one state may perhaps be easier than bringing an electron to a final single state. (Despite the apparently harder problem for the electron being contained within the apparently easier one for the universe.)

    I agree that gravitation is the weakest part of my model. But it is also the newest part. In model #7 I had gravitons as spin 2 particles but changed them to spin 1 particles for model #8. I was previously told by a physicist that no one would buy spin 1, but I persevered anyway! I understand the need for spin 2 where positive mass is the only charge, but I abhor the idea of mass as charge. I could go back to using spin 2 and use 'emergent' mass as the charge. I should think about whether a charge has to be truly 'fundamental' in my model before it could be thus used. That seems very relevant to the aim of this contest about fundamentality. Also, using spin 2 and mass seems to downplay the idea of unification of four forces at high energy in the early universe. Also, GR matches tensorial input for gravitation to tensorial output effects, all macroscopic. I am not so sure you need to put tensors in at the microscopic level to get tensorial effects out at the macroscopic level. But it is early days for me on that issue.

    ..........

    With respect to your paper on BH hairs, thank you for the pointer to Strominger's papers for me to get an introduction. I have already looked at one and watched his online video of July 2017. I was struck almost immediately by a similarity between the BH and CCC use of soft photons. BHs seem to be more structurally interesting at a horizon than CCC is at a node, and I will enjoy comparing the two entities. I noted in the video that Strominger said that he would be attending a conference entirely about probability amplitudes and that many of the features investigated were unphysical, giving a hint that they could not square any of these amplitudes to make probability densities. That says to me that there are no metrics available for these features and that consequently there is no spacetime for them. But all is not lost, as the amplitudes exist, even if only in abstract space (just as the stuff in the photons survives loss of the metric at a CCC node).

    I would like to ask a slightly different question about spacetimes and metrics than before. In my model, nothing happens by chance. Particle/antiparticle pairs are created as outputs of an interaction where the inputs were vacuum particles/fields. Interactions of particles always happen in a spacetime, and I assume that interactions of fields can happen in the abstract space without a spacetime. The created fermion pair (for Hawking radiation) needs to be created in spacetime, as I assume they are not virtual loops. So in the limit as the (stretched) horizon is asymptotically approached there is less and less chance of a fermion pair being created. Switching to the CCC model, that model needs all fermions to disappear as the photons get softer and softer, and random fermion pair creation would lessen the chances of obtaining a CCC node. In my model what comes out of an interaction matches (as preons) what goes in. So the idea that you can get any particle pair randomly and spontaneously arising out of the vacuum is not correct. To get fermion pairs out, one needs to have the correct energy and availability of preons in the vacuum to begin with. This may appear to be completely at random in normal space in the laboratory, but I am suspicious, this time, of creating sufficient numbers of fermion pairs close to the stretched horizon to feed Hawking radiation. I intend to read a lot more about BH hairs.

    I have re-read your paper and rated it highly. You do not need to reply if my BH comments are too naive for you to bother with.

    One can think of a graviton as the triplet entangled state of two spin-1 bosons. A colorless or chargeless entanglement of gauge bosons of the form (1/√2)(|++⟩ + |--⟩) shares the same quantum numbers as a graviton. The graviton is some sort of STU-dual form of entanglements of gauge bosons or gluons.
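    A back-of-envelope way to see the quantum-number match, using only the helicity bookkeeping for two collinear massless spin-1 bosons (the STU-duality statement itself goes well beyond this):

```latex
% Helicities h = +/-1 of two collinear massless spin-1 bosons add along the
% common propagation axis: |++> carries h = +2 and |--> carries h = -2.
% The colourless, chargeless combination
\[
  \frac{1}{\sqrt{2}}\bigl(\,|{+}{+}\rangle + |{-}{-}\rangle\,\bigr)
\]
% therefore carries exactly the two helicity states, +2 and -2, of a
% massless spin-2 graviton, which is the sense in which the quantum
% numbers match.
```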

    cheers LC

    18 days later

    Dear Austin James Fearnley,

    I have read your Essay wherein you mention that 'Quantum spin (S) is a fundamental property of particles'.

    Quantum Mechanics claims that an electron can be both spin-up and spin-down at the same time. In my conceptual physics Essay on Electron Spin, I have proved that this is not true. Please read: https://fqxi.org/community/forum/topic/3145 or https://fqxi.org/data/essay-contest-files/Rajpal_1306.0141v3.pdf

    Kamal Rajpal

    Dear Austin,

    I highly appreciate your beautifully written essay.

    I completely agree with you. «The vacuum is clearly not a void as the vacuum of space contains fields». «In a pure fractal pattern, such as the Mandelbrot figure, patterns repeat at different scales but the same pattern is not everywhere as one has to search for the repetitions which are separated in space».

    I hope that my modest achievements can serve as food for reflection for you.

    Vladimir Fedorov

    https://fqxi.org/community/forum/topic/3080

    Dear Austin,

    (copied to your thread and mine)

    Many thanks for the kind words and interest shown in my work.

    I also have read and appreciated highly your essay.

    You are listening well to the "music" of our universe.

    I wish you happiness in your scientific work in search of truth.

    Vladimir Fedorov

    https://fqxi.org/community/forum/topic/3080

    Hello Austin,

    Excellent essay and comprehensive theory. I do not have an answer to what constitutes an electron. However, I think I can provide some insight in other areas.

    1. "Does light need a medium to maintain its speed?" Yes, see my diagram of dark energy, it is the ether.

    2. "Quantum Theory (QT)and General Relativity(GR)are both superb in their results, but something has to give in melding them. GR uses tensors but it has to do so to get correct results because it has no fundamental access to a mass charge in the micro realm of particles"......... The graviton (in my theory) has mass, just as a guitar string has mass. Light (with no mass) runs on massive gravitons like vibrations run on massive guitar strings.

    Graviton mass:

    a. is the ether

    b. takes the form of either dark matter or dark energy

    c. is the carrier of light.

    3. Why is the speed of light independent of relative motion? ....... Because the ether (gravitons) moves with the observer. Michelson-Morley believed they were moving with respect to the ether. I believe they carried it with them as all observers do!

    Yes, very speculative stuff. Got any good experiments? If there is time visit my essay and let me know what you think.

    Thanks for your thought provoking essay.

    Don Limuti
