Quasi-Dimensionality: An Accidental Blueprint of a Unified Field Theory
By: Mykel A. Waggoner

An introduction to Quasi-Dimensional Dynamics

For as long as we can remember, we have been taught that dimensionality is organized in a way that is both linear and hierarchical, even though relying on that system has led to the stagnation we now find ourselves in. I am therefore going to suggest that the way to liberate ourselves from this predicament is to consider that dimensions are not based on a fixed set of possible formations; rather, a dimension is the result of those formations influencing how our perception allows whatever designs take shape to affect what emanates from them. That process entails the one effect which has always enabled the development of any dimension: time. Time continually halts further movement within a moment so that it can use whatever is contained in that moment for a certain purpose, and that act of capture is the mechanism behind the construction of any kind of dimension.

That is why we have not been able to construct dimensions of the fourth variety and beyond: our approach failed to recognize that geometries form according to the rate of movement that an object can track and measure from its unique vantage point, and that at a high enough velocity a multitude of movements can appear to happen in the same instant. A tesseract, for example, would need one cube captured in time (a framerate), such as one generated in a simulation, and another cube within the same condition, so that the two can oscillate while overlapping each other at high enough speeds to produce the effect of each being seen as the same cube and as neither. This explains why the shape consists of several cubes, and how that can be done.
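
As a loose illustration of the "two cubes alternating between framerates" picture, here is a minimal Python sketch of my own (a toy construction, not anything from the original text): two offset unit cubes are shown in alternation, and an observer averaging over enough ticks effectively sees the union of both vertex sets at once, the way the essay describes a tesseract being perceived.

```python
import itertools

# Toy model: two "framerates", each holding one captured cube.
# Cube A is the unit cube; cube B is the same cube shifted along a
# diagonal, standing in for the "other cube within the same condition".
cube_a = list(itertools.product((0.0, 1.0), repeat=3))
cube_b = [(x + 0.5, y + 0.5, z + 0.5) for (x, y, z) in cube_a]

def frame_at(tick):
    """Alternate between the two captured cubes, one per tick."""
    return cube_a if tick % 2 == 0 else cube_b

# Sampling many ticks, the two framerates blur together: the accumulated
# view contains all 16 vertices, as in a drawn tesseract projection.
seen = set()
for tick in range(1000):
    seen.update(frame_at(tick))
print(len(seen))  # 16
```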

Yet this ability would not be limited to dimensions beyond the third; it applies to the first, second, and third as well, as a way to make shapes that are not conventional by any means. After all, a tesseract is composed of multiple cubes, those cubes are composed of multiple planes, and those planes are the byproduct of the axes used to make them. If the blending of shapes can lead to different shapes, the same can be said of the materials contingent to the formation of those shapes, which means an axis is capable of being more than just a line or a curve. Hence it is feasible for the bounds that constitute a dimension to become entirely blurred, if not removed from having any actual relevance and sustainability at all.

Furthermore, the reason we have not realized this sooner is the habitual attempt to fit everything into a single frame, and the belief that there is no way to connect various framerates together into an entirely unprecedented infrastructure, even though animation, and movement of any kind, depends on exactly that being achievable. A three-dimensional shape or object always requires a continuous stack of two-dimensional planes, layered on top of each other in a sense, to create the effect of depth. This alludes to the notion that space is both interconnected and interchangeable with any other representation of space, and that the formation of space is simultaneously a malleable and a fixed endeavor. It is what the Flower of Life was alluding to: the development and perpetuation of dimensionality, through the need to have each dimension, represented as a circle, embody both a sense of individuality and of unification, so that a new expression of such can form from the process of intersection.

All in all, just as the effect of movement occurs because of the momentum built by framerates continuously transitioning from one to the next, the same happens for energy: the number of framerates oscillating per unit of time determines how that energy will function and express itself, along with how much energy there is.
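
If one wants a quantitative handle on "oscillations per unit of time determining how much energy there is," the nearest established counterpart is the Planck relation E = hν. The toy calculation below is my own hedged mapping, assuming the essay's framerate-oscillation rate can be read as an ordinary frequency ν:

```python
# Planck relation E = h * nu, used as a stand-in for the essay's claim
# that the oscillation rate per unit time fixes the amount of energy.
h = 6.626e-34                   # Planck constant, J*s
for nu in (1e9, 5e14, 1e18):    # radio, visible-light, X-ray frequencies, Hz
    print(f"nu = {nu:.1e} Hz  ->  E = {h * nu:.3e} J")
```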

Lastly, I would like to present the idea of multidimensional speed, which is an extension of the concept of framerates and their purpose. For starters, the reason there are even directions available for dimensions to use in their formation is the way framerates appear and disappear at their own rates, as though each were moving to a personal cadence, with so many framerates fluctuating at different velocities that a symphony is effectively performed at the quantum level. In that process, an axis of any variety, whether a line, a curve, or something else entirely, would stem from framerate oscillations fluctuating at the rate necessary for several framerates to move coherently, in a synchronized fashion. If they did not, framerates would be perceived popping in and out at seemingly random rates, and in places unpredictable at best. This means the intervals used to gauge the distance in rate between the oscillations of framerates would have to be consistent with one another for it to work: a tempo of sorts.

This suggests that speed is instrumental to the formation of directions, and therefore of dimensions, because speed itself possesses directions and dimensions of its own. In essence, this is accomplished by applying the approach stated earlier for using framerates to design tesseracts, which provides speed with the capacity to compound itself until it operates with a sense of multiplicity unlike anything it was known to have. Just as several cubes can overlap each other to produce a hypercube, the same can be done with speed, making it possible for any number used to measure the rate of framerate fluctuations to act as multiple numbers at once: a superpositioned number, as I prefer to call it.
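
A "superpositioned number" has no standard mathematical definition, so here is a minimal toy interpretation of my own, assuming the essay means a value that cycles through several candidate numbers faster than it is sampled:

```python
import itertools

class SuperpositionedNumber:
    """Toy stand-in for the essay's "superpositioned number": a value
    that fluctuates through several numbers, one per framerate tick."""
    def __init__(self, *values):
        self._cycle = itertools.cycle(values)

    def sample(self):
        # Each observation catches the number mid-fluctuation.
        return next(self._cycle)

speed = SuperpositionedNumber(3.0, 5.0, 8.0)
print([speed.sample() for _ in range(6)])   # [3.0, 5.0, 8.0, 3.0, 5.0, 8.0]
```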

This inevitably grants individual frequencies the means to oscillate at a rate whose speed moves as though it were mapping out the entirety of a shape in real time, all while doing whatever else the velocity of the oscillating framerates is intended for. To put it differently, the speed of the frequency involved in measuring the rate of framerate fluctuation can also be determined by treating each number as though it were vibrating, with that vibration representing its embodiment as a number, and with additional numbers for it to be moved to, so as to create a multidimensionality in the vibrational component of these numbers and, therefore, in the frequencies made accessible by this phenomenon. The numbers set as coordinates for both cubes of a tesseract constructed through this method would have no option but to fluctuate as well, even if the numbers were the same for each cube, which proves not only that these numbers induce a form of oscillation, but that all numbers are probably experiencing that very thing, unbeknownst to most, because the conditions were never devised to analyze numbers in such a way.

And with that being said, the takeaway from all of this is that numbers can have just as much dimensionality to them as dimensions can have numbers applied to them, in a similar form; the quantity of numbers and dimensions that can form from this grows toward the infinite, because their interdependent use of each other gives them mechanisms to classify every individual variation that can be devised. However, if a number were to find itself fluctuating among several numbers, all of which operate under the preconceived idea of structure, the conventional sequence of numbers that allows the field of mathematics to function the way it currently does, then those fluctuating numbers would need their own system to rely upon. That is why I imagine such numbers are also capable of supporting techniques for moving themselves along their specific version of a number line, using any form of math, from arithmetic and algebra to the more complex varieties, in the same way it would be accomplished with ordinary, non-fluctuating numbers, provided we are willing to be comfortable with their structure lacking what is familiar to us; even though we were already introduced to that structure simply by having to use the one we involve ourselves with every day.

But I am sure it does not seem that way to most, so I will describe it like this: if all the numbers we use, which are rigid and focused into a singular identity, were actually in a constant state of shifting between numerous expressions, then the structure of numbers we are accustomed to would be, without a doubt, arbitrary in its design. And if that is so, it is plausible to consider that the order of anything is entirely an effect of having been exposed to a system frequently enough that our instinctive tendency to predict potential scenarios or conclusions, as a way to cultivate trust in that system by actively injecting our grasp of its dynamics into whatever it is doing, demonstrates that our desire to treat any modality, even mathematics itself, as absolute and infallible is precisely why we have been able to use it to the extent that we have. That reliance, however, has obscured any chance of recognizing an alternative path. That is, until the moment it can be realized that a methodology of any kind is settled on a foundation of manufactured continuity, the same way any two framerates can technically be positioned near enough to each other in duration that the message of their connection is conveyed, according to the vantage point of the observer. This often comes at the expense of that observer glimpsing the innumerable oscillations that make perceiving anything feasible in the first place: the fluctuations needed for a line to form through the perception of multiple points, or for a curve to form through the same process, just with lines instead of dots, regardless of curves also being embedded within lines. But I digress.

The real evidence for all of this entails envisioning a curve as a single framerate, with a line close enough in position to where the curve would have been had its framerate been the one in use at the time, making it possible to oscillate these framerates the way a tesseract is formed, so that an axis type which is neither just a line nor just a curve becomes usable. Yet the proof is not merely that such an axis can exist; it is that a point placed along the path of this axis type would always be simultaneously aligned with a line and with a curve, as both states are relentlessly in flux. And if that can happen, then no axis type, shape, or dimension can claim that its formation stems from anything other than this process.
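
As a toy rendering of "an axis that is neither just a line nor just a curve" (again my own hedged construction, not the author's), one can alternate per tick between sampling a straight line and a curve, so that over time the sampled path is aligned with both states:

```python
def line(x):
    return x          # one framerate: a straight axis

def curve(x):
    return x * x      # the other framerate: a curved axis

def blended_axis(x, tick):
    """Oscillate between the line-framerate and the curve-framerate,
    so the sampled axis is never only one of the two."""
    return line(x) if tick % 2 == 0 else curve(x)

print([(t, blended_axis(0.5, t)) for t in range(6)])
# alternates between 0.5 (on the line) and 0.25 (on the curve)
```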

In a nutshell, the fluctuation of framerates per unit of time needed for any dimension to form is just like the idea of a dot, the space after that dot, and the dot that follows, with the space between the dots acting as both the symbol and the effect of the time that was not measured. It is believed that dimensionality is about finding a way to fit multiple points of perspective next to and around each other, when it is really about discovering the means to do that by having them overlap each other, like various axes intersecting to form a shape.

P.S. Although framerates can just as easily be described as frames, framerates, like numbers, are defined by the frames they are in relation to, which means they are constantly both a collective and an individualized embodiment of information.

Quantum Force Theory

Instead of there being Dark Matter or Dark Energy, it is just as plausible to consider that the mass of an object always shifts in response to what it is doing and how it is being affected by its environment. The mass of an object would switch between various states of being, as though functioning according to something similar to the binary model, except with more options than just 0 or 1.
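
To make "the binary model, except with more options than 0 or 1" concrete, here is a minimal sketch of my own: a qudit-like register whose mass value is selected from several discrete states by an environmental input (the state count and values are illustrative, not taken from the text):

```python
# Toy more-than-binary mass register: the environment selects one of
# several discrete states instead of just 0 or 1.
MASS_STATES = {0: 1.00, 1: 1.02, 2: 0.97, 3: 1.05}   # relative masses

def mass_in_environment(env_reading):
    """Map an integer environmental reading onto one discrete state."""
    return MASS_STATES[env_reading % len(MASS_STATES)]

print([mass_in_environment(r) for r in (0, 1, 2, 3, 4)])
```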

And if that particular dynamic is the reason for the seemingly arbitrary increases or decreases in mass within any specific object, then it would behoove us to imagine that the phenomenon of Quantum Entanglement is involved: that entanglement between multiple celestial bodies is instrumental to the compounding effect of their masses, ensuring that those masses fluctuate as a result of any difference that might take place within their environment, thereby making gravity and quantum fields interdependent. It is just as likely that all the forces are intertwined with each other through Quantum Entanglement as well. In that sense, it is probably the case that Quantum Entanglement is the effect of the wavelengths of several particles resonating with each other, the frequencies of those wavelengths oscillating at such a rate that, with each passing moment in which this resonance occurs, the power of each particle or component of the resonance can grow or diminish in strength, depending on the rate of that oscillation. The force formed and emitted by any particle would then be the result of the oscillating resonance process that is Quantum Entanglement.

The Meaning of Quantum Entanglement

What most people familiar with the concept of Quantum Entanglement think of when they ponder it is the notion of several particles being so interconnected that a shift in the spin of one leads to the other particle being affected in the same manner, instantaneously. What they do not deduce from this is the possibility that it is not just the spin being affected: the particle observed as having been affected by the shift in another particle's spin may be functioning as an overt, specific, and limited representation of the effects of Quantum Entanglement, with the expanded account being that any particle has the capacity to affect any other particle in any particular way, not just with regard to its spin.

This means that an interaction between any group of particles is the result of their being entangled with each other. If that is so, then the process of writing code, for instance, is basically dependent on this being an accurate assessment of Quantum Entanglement; otherwise the programming of any software would fail to occur the way it does. That is because, in Quantum Entanglement, information is immediately shared between several particles when the spin of one particle leads to the altered spin of another. If such a thing can happen, then the effects of repulsion or attraction between particles, just for example, would suggest that the transference of data is inherent in the process itself. That might make the entanglement of several particles the byproduct of an individual particle having an aspect of its frequency vibrate by exactly the amount needed to be in resonance with the frequency of the other particle it is affecting, which would have it function like a GPS device honing in on the coordinates of the other particle, with other ways of affecting a particle stemming from a different part of their frequencies being in resonance with each other. This essentially paints the picture of each particle containing an entire code within itself, just as any strand of DNA does, with the particle's equivalent of genes determining whether or not another particle will respond to the initial particle in a specific manner. It is similar to using a radio to tune to the frequency of a broadcast: the matching of frequencies between listener and transmitter is enough to ensure that the information conveys itself, the difference being that here the capacity for transmission and reception is present on both sides.
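
A hedged toy version of this tuning picture (the names and the tolerance below are my own illustrative choices): a message only transfers between two oscillators when their frequencies match within some tolerance, as with a receiver locking onto a broadcast:

```python
def can_couple(f_transmit, f_receive, tolerance=0.01):
    """Toy resonance test: coupling succeeds only when the two
    frequencies match within the given fractional tolerance."""
    return abs(f_transmit - f_receive) <= tolerance * f_transmit

broadcast = 101.1e6                       # Hz, an FM-style carrier
for dial in (99.5e6, 101.1e6, 104.3e6):   # receiver settings
    print(dial, "->", "locked" if can_couple(broadcast, dial) else "static")
```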

What is also worth noting is that each particle does not possess or embody just a single frequency; it is constantly hosting a series of frequencies. One of those frequencies will be the same as the frequency of any particle it is able to interact with, and could potentially serve to disrupt an overt case of Quantum Entanglement between several particles, while at the same time another of its frequencies may be different enough from those of the entangled particles to help it shift their frequency, precisely because of that part of its collection of frequencies which differs from the entangled particles' overall frequency. I also imagine that the specific way the frequencies of a particular particle interact with each other ultimately determines the exact force that the particle will emit or be affected by; that the difference between forces like Gravity and Electromagnetism is solely the result of this dynamic occurring within particles, with the effect compounded through entanglement.

If anyone feels inclined to read the full manuscript that this portion of the publication comes from, they can do so through the book Quasi-Dimensionality: An Accidental Blueprint of the Infinite Improbability Field, by the same author as this paper. It is available on Amazon.

a month later

Perhaps one of the biggest errors of the science community is to consider photons massless. It seems very odd, in fact. Here is why: if gravitational waves exist and photons are massless, then it is motions and impulses that create these waves, which would contradict Newton.

But what the thinkers confound is that GR is for observations. If Newton is right, then gravitational waves are due to the fact that photons simply have a very small mass, probably between 10^-54 kg and 10^-70 kg, and possibly less still. The error of theorists searching for quantum gravitation is thus to consider gravitons as the quanta of gravitational waves.
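
For scale, here is a back-of-the-envelope check of my own for what such a photon mass would imply, using the Compton-wavelength relation λ(M) = h/Mc that appears later in this thread (the two masses are the poster's conjectured bounds, read with negative exponents):

```python
h = 6.626e-34   # Planck constant, J*s
c = 2.998e8     # speed of light, m/s

for m in (1e-54, 1e-70):    # conjectured photon masses, kg
    lam = h / (m * c)       # Compton wavelength, lambda = h/(m c)
    print(f"m = {m:.0e} kg  ->  lambda = {lam:.2e} m")
# m = 1e-54 kg gives lambda ~ 2e12 m; current experimental upper bounds
# on the photon mass are themselves around 1e-54 kg.
```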

But in fact, if the cold dark matter and the photons are encoded in the space vacuum of the DE, and are 3D series of spheres, then quantum gravitation appears simply by respecting Newtonian mechanics.

The main codes are in this DE, which is antigravitational and probably also made of series of 3D spheres; the number is preserved, like the volumes of these series, so the densities become relevant. It is simply due to the fact that photons have more energy than mass, while for cold dark matter it is the opposite. This reasoning explains QG, but also the antiparticles, and a fifth force appears. The motions, rotations, oscillations, and vibrations of these finite series of spheres, having the same number as our cosmological series, explain the standard model and the emergent fields and masses of particles.

Einstein was not really wrong about his GR; it is just that we must improve it, and it holds just for observations at high velocities. Gravitation is not really a curvature of spacetime, but that is true for our observations of this photonic spacetime.

If Newton is right at all scales, then photons have a mass and quantum gravitation respects Newtonian mechanics; we cannot confound observations of a photonic spacetime with Newtonian mechanics at slow velocities. If they cannot quantify and renormalize this QG with the fields while trying to unify G, c, and h, the QFT, the QM, and the GR, there are reasons, even with all the mathematics of Lie groups or Clifford algebras, and even with non-commutativity or non-associativity. It is simply because the fields and GR are not the cause of our standard model, and because gravitons are not the quanta of gravitational waves oscillating specifically inside these photons at the Planck scale.

19 days later

The problem of gluons is very important when we consider cold dark matter encoded in the space vacuum of the DE. The problem of mass is essential. Even considering the Higgs boson and mechanism, there is a problem of mass; the Higgs simply activates a part of it but does not explain the mass. In fact, the Higgs activates a part of the mass that this dark matter has encoded. It is as if there is simply a difference between the mass of this DM at the cosmological scale and the baryonic mass. If the gluons are linked with the encoded photons and the encoded cold dark matter, that explains the 98 percent of the mass that we need to account for in this gluon problem. The real problem is the philosophy of GR, with the fields, as the only cause; for me that is not the case, they are just a part of the problem. Without adding this space vacuum possessing the main codes of the DE and the encoded cold dark matter, we cannot solve the problem. My intuitive equation permits a solution with the 3 spacetimes: E = m(c^2 + Xl^2) + Y

    hello to YOU,...

    ""Mass, energy and movement of information are respectively dark matter, dark energy, and gravity.""

    The Meissner effect in superconductivity serves as an important paradigm for the mechanism of generating a mass M:

    λ(M) = h/Mc

    In fact, this analogy is an abelian example of the Higgs mechanism, which generates the electroweak masses of the W± and Z gauge particles in high-energy physics.

    The length λ(M) is identical to the London penetration depth in the theory of superconductivity.
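
    As a quick numeric check of λ(M) = h/Mc (my own worked example, using the standard W and Z masses of roughly 80.4 and 91.2 GeV/c²):

```python
h = 6.626e-34                  # Planck constant, J*s
c = 2.998e8                    # speed of light, m/s
GEV_PER_C2_IN_KG = 1.783e-27   # 1 GeV/c^2 expressed in kilograms

for name, m_gev in (("W", 80.4), ("Z", 91.2)):
    m = m_gev * GEV_PER_C2_IN_KG
    print(f"lambda({name}) = {h / (m * c):.2e} m")
# Both come out around 1.4e-17 to 1.5e-17 m, the sub-nuclear
# "penetration depth" scale of the weak interaction.
```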

    FIAT LUX,....hypothesis,...

    The initial state is a chaotic state of informational equilibrium, where the surplus of information at equilibrium is translated into order, therefore energy, light, and sound... AND MASS. This occurs following the Meissner effect (or Meissner-Ochsenfeld effect).

    Hi Olivier, I am trying to understand why we have this problem with the mass of protons: the current model explains about 2 percent of the mass, and the gluons must supply the other 98 percent for it to converge. I have therefore considered this encoded DM to explain it.

    I'd like to know more about your general philosophy of this universe: what is this information in its fundamentals, what is its origin, and what are the particles and fields?

    "I d like to know more about your general philosophy of this universe, what are these informations in their foundamentals, what are their origin and what are the particles and fields too ? ""

    thx for your interest @Steve .. ;:-)

    Entropic information is capable of unifying all aspects of our universe at all scales within a coherent, global theoretical mathematical framework, materialized by Entropic Information Theory and its formulae.

    information is the code and the code is what creates the process but is itself the process

    structure is ordered by information storage

    ""

    life as the storage of the information

    and the possibility to updated it by a process of self-learning

    to perpetuate that form of information

    ""

    To complete and follow this definition of life, here is the entropic structure concept:

    - non-alive entropic structure

    - alive entropic structure

    - thinking entropic structure

    - knowing entropic structure (conscious one)

    the human brain computes, but not only that:

    some human brain processes cannot be computable, as the processes of creation and intuition are not computable

    it's because the creation process is outside algorithmic logic,... because AI will always stay inside the algorithmic box it was created in

    consciousness is information treated by a subject,..

    it needs a subject and an object,

    unlike information,

    because information can exist by itself

    as it is code

    and

    the code is what creates the process but is itself the process

    So neither consciousness nor the universe can be coded.

    ///////

    ""Mass, energy and movement of information are respectively dark matter, dark energy, and gravity.""

    According to this new model, the fundamental physical vacuum, at the smallest order of sub-matter, is a superfluid state of pairs of Kramers-Weyl fermions constituting dark energy, describable by a macroscopic wave function with small fluctuations of the background superfluid, obeying Lorentz symmetry even though the superfluid itself is non-relativistic.

    In condensed matter, the Weyl point is a singular point of Berry curvature which can be considered a "magnetic monopole" in momentum space [1], regarded as a neutral massless carrier of information, with the nature of the information (+ or -) corresponding to the helicity, i.e. circular polarization (clockwise or anticlockwise). It is these smallest "granularities", the Weyl points, which are the basis of electromagnetic waves and condensed matter, as well as dark matter.

    The origin of the charge in the symmetrized Maxwell equations [2] is in the first instance gravity, which comes from the propagation of the movement of the monopole itself. The propagation of the monopole movement along a specified orientation (the sign in the symmetrized equation) is an eccentric propagation characterized by a logarithmic spiral based on the golden section [3], depending on the true nature of the process involved by the structure itself: loss of entropy, anticlockwise; gain of entropy, clockwise. Moreover, the ideal geometric container for the propagation geometry is a double torus; here, then, are the mathematical reasons for all the phi and vortex structures of nature at all scales of our universe, explaining the physical causes underlying the complete gravitational process of our universe across all dimensional orders.

    Entropic Information Theory. Available from: https://www.researchgate.net/publication/356378350_Entropic_Information_Theory [accessed Dec 02 2021].

    12 days later

    Zeeya and FQXI Community:

    A long time in the works, with many false starts unfortunately reported as "complete," and still, perhaps, a work in progress, the following webpage highlights work done in service of a "new" vision of macroscopic quantum entanglement, while paying respects to David Bohm and his vision. The forecasting section, if it is viable, owes its inspiration to Michael Talbot's book, "The Holographic Universe." If the forecasting algorithm is not viable, we still feel we may be on to something, and request the help of those willing in the FQXI Community to flesh it out, accept it as complete and viable, or vote "Quash," with grounds for their vote. The website contains numerous forays into other topics, and the reader is, obviously, free to explore.

    Thank you.

    -Deserdi Chapas

    DESERDI.xyz Website

    6 days later

    hello to all,...

    I'm here to present to you an alternative model of reality that could be right...

    Entropic Information Theory: Bits from Bit

    as nature loves Ockham's razor, I can explain the simple bases of this model accordingly,...

    Information is code

    and

    code is what creates the process, it is itself the process.

    Mass, energy and movement of information are respectively dark matter, dark energy, and gravity.

    The Entropic Information formulae are able to unify all aspects of the universe through a common base element: information.

    with this formula:

    ln(W) = (action)/h = tν = mtc²/h = (k_B T ln(2)t)/h = (m l_P²)/(h t_P) = A/(4 l_P²) = (mG)/(2πc³ t_P) = (mG)/(kc³) = 2πRmc/ℏ = 4πGm²/ℏc = m√(G/2πch)

    general relativity and quantum mechanics are reconciled by introducing quantum gravity at the Planckian scale.

    with that formula, 4πGm²/ℏc,

    I have all the mathematics and demonstrations to go with it.
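
    As a numeric sanity check of the last expression (my own, not from the post): 4πGm²/ℏc is dimensionless, and for a solar mass it reproduces the familiar Bekenstein-Hawking entropy S/k_B of a solar-mass black hole, which appears to be the point of the formula:

```python
import math

G     = 6.674e-11   # gravitational constant, m^3 kg^-1 s^-2
hbar  = 1.055e-34   # reduced Planck constant, J*s
c     = 2.998e8     # speed of light, m/s
m_sun = 1.989e30    # solar mass, kg

ln_W = 4 * math.pi * G * m_sun**2 / (hbar * c)
print(f"ln(W) = {ln_W:.2e}")   # ~1.0e77, matching S/k_B for a
                               # solar-mass black hole (area law)
```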

      If quantum mechanics and the wave functions are predicted with probabilities of measurements, and inside this ensemble we had hidden variables, that would contradict Bell; but it is total nonsense to consider different values of a hidden variable, because it lies beneath our actual measurements, beneath our actual proofs about QM. The distinction they make with the ensembles cannot actually be found, because if these hidden variables exist they are not within the actual wave functions. It goes against the statistical analysis and the Bell theorem, violating the determinism of our actual system of measurements. Hidden variables could exist, but they are not in our actual spectrum of analysis, and they are not about the wave function, because these waves and their ensemble must respect the fundamentals. They can try to prove their ideas with a system of measurements, but it will not give an answer, simply because it is beyond this whole actual system, its statistics, and the wave functions. Their error is to affirm that it is measurable in an ensemble just because they have chosen to say so. With an invention they want to differentiate the ensembles, and so they want to create these hidden variables somewhere in the states of space; Hilbert space is not the problem, the statistics and probabilities of measurements are.

      It is not possible for the measurements, detectors, or computations, because we simply have these limitations. All their ideas go against causality and locality under our actual fundamental parameters of observation, measurement, and the probabilities and statistics of our wave functions. They can say whatever they want with different ensembles and mathematical tools; they simply do not violate Bell.

      Charlie, Debbie, Alice, Bob, or Peter, Paul, Jacques shall not change the observations and measurements; it is not about the interpretations of each observer but about pure, rational quantum results.

      In fact, if ensembles with hidden variables exist, they are beyond our actual knowledge and do not interact with the actual wave functions, so they cannot be measured, computed, or observed, because they have nothing to do with our actual physics. Our physics is emergent from deeper logic, yes, but that does not contradict Bell, in the sense that we simply have a superdeterministic system of measurement with our actual tools.

      If thinkers want to disprove Bell with our actual system, they must explain the cause, and not with these wave functions.

      I do consider hidden variables, but not in the actual ensemble of wave functions; furthermore, they are probably stable hidden variables that acted before our measurements, so it is not possible.

      Our actual standard model is a result of evolution; we have baryonic matter as the result of a cause, with fields and waves, and so when we measure them we respect superdeterminism and Bell. If we have hidden variables, they come before the creation of this baryonic matter, and we still do not know how or where. All this is to say that our actual baryonic matter and our actual wave functions are stable systems under specific superdeterministic waves, fields, and properties; you cannot change them. Furthermore, imagine we want to explain consciousness with actual local hidden variables: it is not possible, because first we have a farther cause in our standard model, and second, it does not come from the fields of GR.
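
      Since "violating Bell" keeps coming up, here is a small self-contained sketch (my own illustration, not the poster's) of the CHSH form of Bell's theorem: a local deterministic hidden-variable model, whatever its strategy, keeps |S| <= 2, while the quantum singlet prediction E(a,b) = -cos(a-b) reaches 2*sqrt(2) at the standard angles:

```python
import math, random

def lhv_outcome(setting, lam):
    """One local deterministic strategy: the outcome depends only on
    the detector setting and the shared hidden variable lam."""
    return 1 if math.cos(lam - setting) >= 0 else -1

def lhv_E(a, b, trials=100_000):
    """Correlation E(a,b) under the local hidden-variable model."""
    total = 0
    for _ in range(trials):
        lam = random.uniform(0.0, 2.0 * math.pi)   # shared hidden variable
        total += lhv_outcome(a, lam) * -lhv_outcome(b, lam)
    return total / trials

def chsh(E, a, ap, b, bp):
    return E(a, b) + E(a, bp) + E(ap, b) - E(ap, bp)

quantum_E = lambda x, y: -math.cos(x - y)   # singlet-state prediction
a, ap, b, bp = 0.0, math.pi / 2, math.pi / 4, -math.pi / 4

print("local hidden variables |S| =", round(abs(chsh(lhv_E, a, ap, b, bp)), 2))     # <= 2
print("quantum singlet        |S| =", round(abs(chsh(quantum_E, a, ap, b, bp)), 3)) # ~2.828
```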

      If my reasoning about the spheres and the universe is correct, we have 4 main finite systems: the photons, made of series of spheres; the cold DM, too; the DE space vacuum, for the main codes and information; and the baryonic matter, due to the 3 others merging. And we probably have a fifth system, in 0D, for the infinite eternal consciousness. The universe seems simple, generally. The quantum series probably have the same number as our cosmological series of 3D spheres; oddly, it approaches the Dirac large number. At present we observe GR and a part of our standard model, but we need to go farther and deeper into all this.

      An interesting thing is that if the universe has chosen these 3D spheres as its primary essence, we have a kind of superfluidity for the 3 ethers; this is interesting for the contact when we apply specific series and volumes. The waves and fields are emergent from the motions, rotations, and oscillations of the 3D spheres.

      The correlated philosophy is a little different from that of the strings; my works are an assumption, but the strings too are an assumption, and we need to know more. I believe we could prove these spheres, maybe with the Ricci flow, the 2 other ethers, and the Poincare conjecture.

      The idea of superimposing the 5 systems with spherical topological-geometrical algebras can be very relevant. Alone, I cannot solve everything; I know that thinkers have difficulty changing their philosophy, and that the strings, like GR alone, are an institution, but my humble reasoning can converge and complete the puzzle.

      a month later

      STRUCTURE FORMATIONS OF BIOMOLECULES

      We study a nano-structural 4-dimensional new atomic model, mainly for biological elements, on the basis of the Super Unified Theorem SU(11) ⊃ SU(6) × SU(3) × SU(2) × U(1). That is, up-converting quantum masses, or quantum dots' positions or points, have been considered as four coordinates, such as x = v1t, y = v2t, z = v3t, and v4t, a pseudo-coordinate due to optical holographic energies, assuming the general atomic structure of the matter atom is influenced by exotic matter fluids and provided with forces by new energy sources of SU(6) in the respective framework of SU(6) × U(1), where U(1) creates a magnetic monopole (once created, never destroyed); hence a new kind of electromagnetic force with current is created within the GUT model of SU(5) ⊃ SU(3) × SU(2) × U(1). This new model is likely an extension of Rutherford's or Niels Bohr's atomic model.

      In this new model, we assume some new kinds of quantum particles in wave status, created first from the exotic matter fluid by maintaining the entanglement of wave-wave duality. The formation of mass is actually accomplished by the resonating vibration of new energies (explained in detail in my published articles); the resonating energies then appear as boxes, packets, or bunches of high-energy frequencies of short wavelengths. Hence we find new kinds of particles, quark-type but lepton-like, tightly bound by the bosons of the strong forces of SU(6); other particles then form by exchanging bosons of SU(6) into bosons of SU(5). In GUT theory, we observe that tight binding by different quarks and gluons forms different kinds of new particles of matter atoms, some of them not visible in practice but confirmed in the laboratory. We then find molecules, monomers, then polymers, etc., under the network of the new kind of electromagnetic force. Again, light energies of incoming information or consciousness within the resonating boxes are the main indicator of the formation of life and, thereafter, of everything. This new kind of laser-like focusing beam appears with different gradient forces, optimizing its potential, behaving like holographic optical tweezers and trapping nano-particles. Therefore, with encapsulated quantum dots and up-converting nano-particles, we find the different elements, compound elements, etc. of ordinary matter atoms, continuing their resonating vibration "mode" to fabricate the structure formations of different complex biomolecules like DNAs and proteins, and also creating catalysts to accelerate the system thereafter.

      13 days later

      Relativistic Expanding Mass (REM) Model

      Overview

      by Ted R. King Jr

      rabbitwhole.com

      The Illusion of Gravity

      Another crackpot theory by an outsider? Einstein was a crackpot outsider--until he wasn't. Even if this model gets soundly falsified, that would spare all of us from wasting any more time on it. So far that looks very unlikely. Exploration and refinement are called for, by those who are qualified and enchanted by the possibilities.

      Here is an alternative interpretation of gravity that addresses some widely acknowledged fundamental flaws in General Relativity and Quantum Field Theory that prevent their unification.

      Like many in the physics community, I strongly suspect the interpretations of our core experimental data need some fundamental adjustments. Here is an elegant, single-postulate one that simplifies almost everything.

      REM Postulate:

      All mass is expanding, deforming the adjacent aether, which is expanding enough more slowly to agree with General Relativity.

      Newton's inverse-square formula, contrived to match Kepler's geometric data, appears to be an ad-hoc construction: it interprets gravitational mass as generating spooky action at a distance exactly equalling the inertial mass defined by his brilliant 2nd law, with a gravitational-constant fudge factor, G, figured out to make them match. He admitted that he, like we, had no idea of the nature of gravity to justify this interpretation, but it stuck, because in his day there were no other plausible explanations. An expanding Earth would never have occurred to his generation; only force could move things, and his gravity-as-force equation, along with his 3 famous laws, has endured to this day as the foundation of physics for its predictive power, but, as Newton admitted, not for illuminating our understanding of the nature of reality. To this day the prevailing opinion is that nobody really understands gravity.

      Einstein established beyond reasonable doubt that mass curves (actually deforms) space and time, and that there is no such thing as gravity. Why have generations of top physicists, including Einstein himself, been beating this dead horse, trying to no avail to quantize gravity to play nice with QFT?

      What if Newton's 2nd law is all we need? What if all mass is inertial mass? What if we don't fall to Earth, but Earth expands to run into us at 9.8 m/s²? What if we are just non-accelerating inertial mass, with no measurable spooky force at a distance to push us down, and the only measurable force is upward against our shoes? That seeds a massive, rapid phase-changed physics and unblocks the road ahead.

      Too outrageous? The only force when you drop an accelerometer is an upward g force on the dropper. Think about it. Is gravity the only exception to Newton's 2nd law?

      This is no more outrageous than Einstein's crackpot (until it wasn't) relativity theory. There is no limit on how far inflation can inflate in what we currently hypothesize is an infinite, flat, fixed-density, inflating universe. Only the REM version of inflation applies to all mass, which is just confined energy, even at the subatomic scale, and to a resurrected luminiferous aether, deformable by adjacent mass or accelerating charge, which, being deformable, also propagates light. Einstein's spacetime "curvature" under the REM interpretation should still agree with almost all accepted experimental data, with no need for impossible singularities or negative gravity. At last, a visualizable model, not only for "gravity" but for relativity as well! This moves the quest to understand physical reality to exploring the nature of the aether and the genesis of particles therein. This is worth serious exploration.

      Einstein also showed us that mass in motion distorts spacetime, which can thus be treated as some kind of compressible "fabric" that propagates light, such that an observer in a compressed frame measures proper distance and time, and thus light speed, as c, while a less-compressed observer "sees" the compressed rulers as shorter and the time as slower. Thus the REM interpretation of the equivalence principle renders the laws of physics, except for now-dethroned gravity, the same in any frame, accelerating or not. The only evidence of this inflation recognized so far is what we call "gravity." Even unaccelerated mass in motion with respect to the local fabric would distort it accordingly, and no observer could tell the difference, any more than in Einstein's closed-elevator model. That includes Michelson-Morley type experiments. Einstein's fabulous equations would still have mass telling spacetime how to "curve" and spacetime telling mass where to go, and the proper vacuum light speed would still be the same in any frame, because rulers and clocks accommodate to local compression.

      How can the REM model be tested?

      To be determined, of course, but one obvious test is done every day, all over the world, when any object is allowed to free-fall to Earth. It really is in free fall, "feeling" no 2nd-law force (as would be measured by an accelerometer) and therefore no 3rd-law opposition. Another major clue is that, neglecting air resistance, objects of different masses released from the same height hit the ground simultaneously. This is very strange, since the other known forces described by QFT accelerate objects in their respective fields according to charge magnitude. Expecting the "gravity" field to act likewise fails, as objects fall at the same rate regardless of mass.
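
      A toy calculation of the point being made (my own sketch, not the author's): whether one pictures the object accelerating down at g or the surface accelerating up at g, the meeting time depends only on the release height, never on the object's mass:

```python
import math

g = 9.8   # m/s^2: acceleration of the object (standard view) or of the
          # expanding surface (REM view); the kinematics are identical.

def meeting_time(height_m):
    """Time for surface and object to meet: h = (1/2) g t^2."""
    return math.sqrt(2 * height_m / g)

for mass_kg in (0.1, 1.0, 1000.0):   # mass never enters the formula
    print(mass_kg, "kg ->", round(meeting_time(2.0), 3), "s from 2 m")
```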

      ALL mass is inertial mass in the REM model, which solves the mystery of why "gravitational" mass equals inertial mass.

      Some Possible Implications to Explore

      Black holes without singularities: just expansion at light speed running into the Schwarzschild radius. No need for a singularity-driven Big Bang.

      Dark matter and dark energy explained by continuous creation, which is required for constant density in an expanding universe.

      The expensive, fruitless search for the graviton is finally put out of its misery.

      Deeper understanding of the "fabric" of space-time, possibly leading to some parametric characterizations illuminating quantum theory and enabling first-principles derivations of fundamental constants.

      Refiguring Quantum Field Theory and the Standard Model.

      It seems so far that ultimately all mass is confined light in a single field, with "particles" formed by topological wave interference in confined space.

      Faster-than-light warp drive by localizing and directing enormous bundles of energy.

      The role and roll of consciousness in the nature of reality - think about it as you wiggle your pinky!

      We will never come close to understanding the nature of reality without delving deeply into the nature of consciousness. It is already clear that all human creations originated in human minds, and as yet no instance of the emergence of consciousness from matter has been reported. There is no valid reason to discard the idea that all of physical creation originates in consciousness, and not vice-versa. This accords with the most revered and enduring wisdom of the ages.

      Two hints: time and space are human creations! There are unlimited realms of consciousness with a new one created at every point of selecting one alternative over another. Think parallel universes and unimaginable, to us, forms of consciousness!

      There are more things in heaven and Earth than are dreamt of in our fledgling science.

      Ted R. King Jr.

      2-7-22

      23 days later

      Thinking about superdeterminism:

      For a model based on a non-associative algebra, might it be possible to ascribe the phenomenon of quantum measurement/collapse to its non-associativity? Between measurements, particle trajectories would be deterministic; but because handedness and the order of algebraic operations could differ for different points of observation, calculations of trajectories between events would be ambiguous. As a result, any attempt to predict events could only be probabilistic. Events would be loci for which the calculations for all points of observation generate a consistent, though not necessarily identical, result, some latitude being possible subject to the limits imposed by the Heisenberg uncertainty principle. Loci with inconsistent results would be occupied by "quantum foam". A theory based on this approach may require application of the principles of chaos theory.

      This thought was prompted by the way in which the pattern of particles of the standard model can be found within the subalgebra structure of a non-associative quasi-group.

      See viXra.2203.0014v1

      15 days later

      Intergalactic ... Riemann

      Intergalactic Travel and Riemann Hypothesis - featuring Dark Matter, Dark Energy, Higgs Boson, Higgs Field, Electroweak Interaction, No Big Bang, Plus Other Physics and Mathematics

      Rodney Bartlett

      Abstract

      Using a scientific method that permits arrival at destinations in the universe to occur instantly is not only much more efficient, but it avoids dangers like accumulated damage from years of exposure to micrometeorites and cosmic radiation. This article proposes such a method, based on an engineering experiment conducted at Yale University and reported in a science journal in 2009. It proposes a mathematical universe, again based on the work of well-known scientists but extended into details of my own. A universe that is mathematical in its foundation is required, since the gravitational-electromagnetic unification spoken of in connection with the Yale experiment is proposed to function topologically (using the Mobius strip and the figure-8 Klein bottle). Admittedly, my proposal is based on technology which has not yet been developed, except for the Yale experiment; but since the first step in developing this technology may already have been taken 13 years ago, determined efforts should see its fulfillment in 50-100 years. And of course, the technology is not limited to travel within the solar system, say, to Jupiter's moon Europa or to Mars, where instant travel would save astronauts and cosmonauts from many months of isolation, threats to muscles and bones, radiation, etc. It could also be used for interstellar and intergalactic exploration, and even for investigations into Earth's past and future, if the potential for time travel is realized.

      Most people aren't accustomed to thinking that the universe is literally composed of mathematics (binary digits, Mobius strips, figure-8 Klein bottles, Wick rotation). I developed these ideas after reading about several professors: John Wheeler, Max Tegmark, Erik Verlinde, Ed Fredkin, and Rafael Sorkin. The Introduction will explain that the Riemann hypothesis doesn't just apply to the distribution of prime numbers but can also apply to the fundamental structure of the mathematical universe's space-time. When applied to the universe, it explains the static universe, dark matter, dark energy, the Higgs boson/field, and aspects of particle physics like the electroweak interaction.

      Unifying gravitation and electromagnetism has this consequence: the electrical-engineering experiment at America's Yale University, together with the ideas of Albert Einstein, tells us how we could travel to other stars and galaxies. An electrical engineering team at Yale demonstrated that, on nanoscales, light can attract and repel itself like electric charges or magnets.

      "Tunable bipolar optical interactions between guided lightwaves" by Mo Li, W. H. P. Pernice & H. X. Tang, Nature Photonics 3, 464 - 468 (2009)

      This is the optical bonding force. For 30 years, until his death in 1955, Einstein worked on his Unified Field Theory, with the aim of uniting electromagnetism (light is one form of this) and gravitation. Achieving this would mean the quantum components (gravitons) of gravity/spacetime-warps between spaceships and stars could mimic the optical force and be attracted together, thereby eliminating distance (this, possibly acting in partnership with repulsion, could produce a wormhole, or shortcut, between folds in space and time). If the gravitons are superposed and entangled, distances between both points in space and points in time are totally eliminated.

      8 days later

      https://www.deserdi.xyz has been revised, edited and updated. If the snippet below catches your interest, we welcome you to visit the website...

      Standard quantum entanglement is a property of two particles with the same genesis. Measurable aspects of two entangled particles, such as "spin", find themselves correlated in such a fashion that a change in one particle results in an instantaneous change in the other, even if they have moved vast distances apart since their conjoined birth. This influence appears to imply faster-than-light information transfer, although there have been some problems with capitalizing on this property in fields related to communication.
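
      For readers who want the textbook correlation behind that description, here is a small Monte Carlo sketch (my own addition, not from the website): for a spin singlet, measurements along directions separated by an angle theta come out opposite with probability cos²(theta/2), giving the correlation E(theta) = -cos(theta):

```python
import math, random

def sample_singlet_pair(theta):
    """Draw one pair of +/-1 outcomes with singlet statistics: the two
    results are opposite with probability cos^2(theta/2)."""
    a = random.choice((-1, 1))
    b = -a if random.random() < math.cos(theta / 2) ** 2 else a
    return a, b

def correlation(theta, trials=100_000):
    pairs = (sample_singlet_pair(theta) for _ in range(trials))
    return sum(x * y for x, y in pairs) / trials

for deg in (0, 45, 90, 180):
    th = math.radians(deg)
    print(f"theta = {deg:3d} deg:  simulated E = {correlation(th):+.3f},  "
          f"exact -cos(theta) = {-math.cos(th):+.3f}")
```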

      Macrocosmic quantum entanglement, to take the idea even further, relates to phenomena which spring from the common birth of all matter and energy in the Big Bang. Some speculate that perceptions of synchronicity, as well as Bohm's postulated holographic properties of the Cosmos, all spring from standard entanglement models applied to the grand scale of a Universe with a single common 'entangler' like the Big Bang.

      Bohm spoke of the Universe as being a "Holomovement," in which every part contains the whole. I first saw a hologram in my high-school physics class, and was fascinated. At deserdi.xyz, our work starts with the attempt to mathematically unlock the universal hologram, in a pragmatic way which satisfies the scientific prerequisite: namely the prediction of future events from past events. Our approach should apply to all measurables, whether they be microscopic entities or larger scale phenomena.

      On our website, we aim to strip scientific forecasting to its bare bones, capitalizing on the holographic properties embedded in Fourier analysis, to achieve one of the great aims of science, prediction, using very few standard physical models. All we really retain is the memory of measurements made in the past and the projection of those measurements forward in time with the help of the French mathematician Joseph Fourier's work.
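
      Since the forecasting is described as Fourier-based, here is a minimal sketch of the generic technique (my own illustration of Fourier extrapolation, not the site's actual algorithm): fit the series' dominant frequency components and project them forward in time:

```python
import numpy as np

def fourier_forecast(series, n_ahead, n_components=3):
    """Project a time series forward by extending its dominant
    Fourier modes; a generic sketch of Fourier-based forecasting."""
    series = np.asarray(series, dtype=float)
    n = len(series)
    mean = series.mean()
    coeffs = np.fft.rfft(series - mean)
    freqs = np.fft.rfftfreq(n)                    # cycles per sample
    top = np.argsort(np.abs(coeffs))[::-1][:n_components]
    t = np.arange(n + n_ahead)
    out = np.full(t.shape, mean)
    for j in top:
        amp = 2.0 * np.abs(coeffs[j]) / n         # conjugate-pair factor
        out += amp * np.cos(2 * np.pi * freqs[j] * t + np.angle(coeffs[j]))
    return out[n:]

# Toy usage: remember a noisy periodic past, project 20 steps forward.
past = np.sin(np.arange(200) * 2 * np.pi / 25) + 0.1 * np.random.randn(200)
print(fourier_forecast(past, 20).round(2))
```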

      Paradoxically, we will later discuss whether "prediction" is even worth undertaking, for various reasons, some pragmatic and some spiritual. That said, it does seem that, since we humans are addictively dependent on our predictive efforts, we may be bound to them for now. Perhaps, though, we could gradually rein in our obsession with both past and future forecasting to shorter and shorter time frames; it remains for humanity to undertake a dialogue as to whether we would be spiritually well-advised to wean ourselves from the near-universal obsession with prediction and control.

      9 days later

      Understanding the "area law" with regard to black hole entropy, based on an underlying fundamental theory, has been one of the goals pursued by all models of quantum gravity. In black hole thermodynamics, black hole entropy is a measure of uncertainty, or lack of information, about the actual internal configuration of the system. The Bekenstein bound corresponds to the interpretation, in terms of bits of information, of a given physical system down to the quantum level. However, at present it is not known which microstates are counted by the entropy of black holes.

      Entropic Information Theory claims that its new formulation of the entropic information approach, based on the bit of information, gives an explanation of the information processes involved in calculating the entropy of missing information from black holes, down to the quantum level.

      The entropic information approach is founded on the bit of information: the number of bits of the system, i.e. the number of bits necessary to specify the actual microscopic configuration among the total number of allowed microstates, and thus to characterize the macroscopic state of the system under consideration.

      Here, the entropy of a thermodynamic system in equilibrium measures the uncertainty as to which of all its internal configurations compatible with its macroscopic thermodynamic parameters (temperature, pressure, etc.) is actually realized.

      Here is a short introduction to the mathematical black hole entropic information formula [1] and to the new entropic information definitions [2], [3]:

      [1] S_BH = (kc³ t ln(2))/(16π² GM)

      [2] K²(T ln(2)t)/h

      [3] mc²(K ln(2)t)/h

      here is a short presentation about Entropic Information Theory (EIT):

      https://www.youtube.com/watch?v=M8cQ8aEOaBk

      and here is the link to the publication:

      https://www.journalpsij.com/index.php/PSIJ/article/download/30304/56856/

      2 months later

      The universe is 42 trillion years old. Here is a summary of my research:

      I quantize space. My quantization uses spheres: variable-size (energy-dependent) complex-number 2-spheres. Every particle is also a quantum of space. I call this SPACE PARTICLE DUALISM (SPD), and so my approach to quantum gravity is called space particle dualism theory.

      I describe gravity with path-abundance differences in the quantum vacuum. When you have overlapping spheres, and paths through space always lie on the surfaces of spheres, then regions of space with more of these space atoms (elementary spaces/surfaces) will have more pathways leading through them, and thus things are attracted to such regions.

      This leads to a very different cosmology, one in which Newton was right when he said that a universe with no center and no edges cannot collapse due to gravity.

      The cosmology of my quantum gravity theory is called ENTROPIC EXPANSION. It states that the entropy density of the universe is a constant; it never changes. So the universe expands in order to compensate for the entropy increase inside supermassive black holes.

      According to SPD/EE, the post-decoupling universe is exactly 42 trillion years old, while the total age, including the pre-decoupling time, is 86 trillion years. I have tested these two figures in many ways:

      1. Using 8,846 supernovae, I showed that the time-redshift/distance-redshift equation of SPD/EE leads to supernova luminosities that decrease perfectly with the square of the distance. I also demonstrated that mainstream cosmology totally fails at this.

      2. I calculated the age of 35,000 white dwarfs. It turned out that 1% of them were within the GR age of the universe, and the rest exceeded it. At the same time, 100% of white dwarfs were within the SPD age of the universe of 42 trillion years. I did not account for the shedding of mass in this calculation, because it is mostly the lightest white dwarfs which are the oldest.

      3. I calculated that if we assume a universe that is 42 trillion years old, then all dark matter is simply burned-out stars and stellar black holes. My calculation leads to 90%-94% dark matter for the Milky Way.

      4. I found 16 stars that have invisible companions. For 12 of them I was able to calculate the most likely temperature of the invisible companion. Those temperatures were extremely low. 10 out of those 12 I calculated can be classified as end stage black dwarfs, because they have temperatures of less than 1,000 Kelvin.

      The fact that all the temperatures I calculated were below the highest temperature that still escapes optical detection shows that my method of calculation is sound.

      5. I calculated that the initial temperature of the universe was 6,000 kelvin, and that leads to the right size for the BAO (baryon acoustic oscillations).

      6. I used the chemical abundance of heavy elements to estimate the age of the universe, and that brought me again into the order of magnitude of 10^13 years.

      7. I explained the black-hole-to-bulge ratio using the physics of exotic matter, which I theorize to be the basis of primordial black holes/primordial exotic holes. This type of exotic matter has positive gravity; there is no repulsive gravity in SPD.
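      As promised under point 1, here is a minimal schematic of the standard-candle consistency test, assuming a hypothetical luminosity and a placeholder linear d(z) rather than the SPD/EE relation; no real supernova data is used.

```python
# Schematic standard-candle test: infer L = 4*pi*d(z)^2 * F from observed
# fluxes and ask whether the inferred L stays constant across redshift.
# All numbers are synthetic placeholders.
import math
import random

c  = 2.99792458e8   # m/s
H0 = 2.2e-18        # placeholder Hubble rate, 1/s (~68 km/s/Mpc)
L_TRUE = 1e36       # synthetic standard-candle luminosity, W

def d_model(z):
    """Placeholder distance-redshift relation d = c*z/H0; substitute any
    model's relation here to run the same consistency test on it."""
    return c * z / H0

random.seed(1)
for z in (0.01, 0.05, 0.1, 0.3):
    d_true = d_model(z)                          # synthetic 'true' distance
    F_obs  = L_TRUE / (4 * math.pi * d_true**2)  # inverse-square flux...
    F_obs *= 1 + random.gauss(0, 0.05)           # ...plus 5% measurement noise
    L_inferred = 4 * math.pi * d_model(z)**2 * F_obs
    print(f"z = {z}: inferred L = {L_inferred:.3e} W")
# A distance-redshift relation 'passes' when the inferred luminosity stays
# flat in z; systematic drift with z would indicate a wrong d(z).
```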

      I made a video in which I calculate the age of the universe:

      https://www.youtube.com/watch?v=MqsW_uJGSSk&t=822s

      My academia.edu profile is here:

      https://polyu.academia.edu/SkyDarmos

      a month later

      Hello everyone, my name is Trevor Johnson and I am looking for serious feedback on an alternative theory of reality that I wrote a couple of years ago. My paper examines consciousness as a fundamental building block of reality.


      12 days later

      DNA is the origin of the universe. We have three distinct brains: the left hemisphere, the right hemisphere, and the subthalamic "ontological" brain. Our consciousness is anchored in our left brain; our right brain and lower brain operate on genetic directive. Our conscious creative intention reflects back off the genetic program as thoughts/ideas from our right brain, with "sarcastic reflections" on the offsets. Free will vs. determination. Our genetically directed brains, particularly the "Granny" ontological brain/person, are intimately entangled with the universe. DNA is the source of the information of a holographic (collective hallucination) universe. Evolution is by Roommate Agreements. John Lennon: "A dream you dream alone is only a dream. A dream you dream together is reality." Talent expresses instinct. This should be reflected in all our media. These are theories, not speculations. You can read more and evaluate the evidence at the Peterborough Meetup, Philosophy of Mind group. It is all free and online. Any questions or comments at this juncture are a waste of time. This is big, and a game changer.

      I have read my old posts on FQXi; my English, which is of course still not perfect, was not very good then, and my theory was at its beginning. I have learned a lot over these years and improved my theory of spherisation, this optimisation-evolution of the universal sphere, or future sphere, built from quantum and cosmological 3D spheres. My general equation for mass-energy has been improved too. As I explained, I consider this DE and this CDM important, with series of 3D spheres for the particles of photons, DE, and CDM; it is when they merge that they create ordinary matter. The massless scalar fields of this antigravitational DE possess the main information and encode these photons and this CDM to create the diversity of matter and its properties. A fifth force appears, and quantum gravity becomes a pure Newtonian mechanics in which we have different Newtonian gravitational fields in our standard model. We have arrived at a new era for our physics: we have this Newtonian mechanics and this GR-like interpretation of gravitation. Einstein improved on the work of Newton; both are relevant in their frames of analysis. Now we must complete their work by considering these new parameters, this DM and this DE. It is not that we must modify Newtonian mechanics or modify general relativity; we must simply add these new scalar fields and particles of this DM and DE, and at all scales.

      The fact that this DE is a negative-pressure antigravitation is relevant for the evolution of the universe, considering the equation of state. And it is relevant to consider the couplings of this DE in our standard model, because this antigravitation, this fifth force, is fascinating and permits balancing the actual gravitational system and the electromagnetic forces. The massless scalar fields of this DE, possessing the main codes, furthermore permit explaining this evolution and the diversity of atoms, the chemistry, the biology... The current standard model cannot explain this; it lacks pieces to add to this puzzle. The massive scalar fields of this cold dark matter also seem important.

      I repeat: if the three systems, free cosmologically speaking, are essential, and if this DE possesses the main information permitting the creation of ordinary matter by encoding the photons, which are quanta of electromagnetism and thermodynamics, and if the cold dark matter consists mainly of quanta of mass, permitting even a pure Newtonian mechanics, then it becomes clear.

      The real interest is to find their couplings in our standard model; these particles exist. It seems that everything is under a kind of antigravitation-gravitation balance, and the three currently known forces of our standard model are balanced too by these massless scalar fields of this DE. That becomes clear when one considers that the photons simply create the three actual forces through the number of photons encoded.

      The Higgs mechanism seems to follow this logic: the massless scalar fields of the DE and the massive scalar fields of the DM permit giving a mass to the W and Z bosons. We have the same kind of mechanism for QCD, and the missing mass is explained with kinds of axions giving mass from this dark matter under an antigravitational field of this DE.

      My equation is about all this. It is an equation which can probably be improved, but the generality is there: E = m(c² + Xl²) + Y = 2mc². It is important, philosophically speaking, for how matter is created from mass and energy; the energy of these photons does not become a mass just like that, with specific fields and oscillations of this GR and its scalar fields. We need the three systems to create a baryonic ordinary mass and matter.

      If I am right, in all humility, it is really revolutionary philosophically, mathematically, and physically speaking, because it is mainly about how ordinary matter is created and what the philosophical origin of this universe is.

      The most incredible thing in this reasoning about the DE possessing the main information, and not being invariant, is the complexity of this information: these scalar fields are not the same for all the particles of DE, and that is intriguing ontologically and philosophically.

      The universe is like a balanced yin-yang system in evolution, with small differences permitting the motions in this fluidity and the three spacetimes. Gravitation and antigravitation really seem to be the two main orchestra conductors, and these photons of course permit the three known forces, simply augmenting this gravitation-antigravitation balance of forces. We have a fifth force, and this force is more than fascinating considering its complexity of codes... A new entropy appears, a new theory of information.

      Einstein and Newton were right in their frames of reference; now we can go further by completing their work with deeper frames.

      This intuitive equation, which I have improved over these years, can be revolutionary if I am right, physically and philosophically speaking: E = m(c² + Xl²) + Y = 2mc²
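      Taken at face value, the double equality in this equation fixes Y once m, X, and l are given; a short algebraic note (pure rearrangement of the posted equation, nothing added):

```latex
% Consequence of the posted double equality E = m(c^2 + X l^2) + Y = 2 m c^2:
\begin{align*}
  m c^2 + m X l^2 + Y &= 2 m c^2 \\
  \Rightarrow\ Y &= m\,(c^2 - X l^2)
\end{align*}
% So Y is not an independent parameter: the equation carries one constraint,
% and E = 2mc^2 holds exactly when Y takes this value.
```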