Macca, thanks for sharing. I liked it a lot because you give a different angle, and I try to do the same with my theory of spherisation. It is important, I think, to give other angles. The physics community needs different angles, because otherwise it goes round in circles, having tried everything without improving or finding anything new. The spheres I use in my spherical topological geometric algebras are meant to give another angle than the strings, photons, and GR. String theory has become an institution: they treat it as a mathematical accident or a god, and so they consider a possible aether of 1D strings at the Planck scale, connected with a 1D cosmic field of GR, as if God had used the information in these photons with these oscillating strings. In my humble opinion, we need different angles than this philosophical prison of strings and GR as the primary essence. That is what you give with your first-principles Lagrangian field theory.

Congrats on your approach and on this different angle given to the community.

6 days later

Well, it was certainly nice to hear you enjoyed reading my work, and I would very much look forward to reading yours, as good manners suggest I do. Be forewarned: I state my beliefs firmly, but never with condescension or closed-mindedness, and always with an earnest and true nature, toward finding the best understanding possible. You have to question things in science, as you correctly surmised, just as a family tree needs new genetic material to bolster it and give needed biodiversity. And I also think that if a scientist can tell you what his experiment is going to do, he is not a very good scientist. Anyway, I will certainly give it a look, but don't keep me hanging too long; send a straight-up copy of what you purport, and let's see how cool your axioms could be. So don't forget about me, I really want to read your ideas.

KM

Hey Stuart,
Thanks for taking the time to engage — genuinely appreciated. You’ve raised exactly the kind of questions I want people to ask, and you're spot on to press for clarity.

What I’m trying to fix (or possibly un-break) is the lack of a physical, mechanical basis for things like gravity, quantization, and EM wave behavior. I’m working from the idea that space isn’t empty — that it’s filled with a compressible medium ("Luxia") and that pressure and torsion in this field can produce all the behaviors we currently describe using abstract frameworks.

So rather than layering a new dimension onto GR or QFT, I’m offering a different angle altogether — one that starts from first principles:

Mass displaces the medium → pressure gradients → gravity
Torsional oscillations in the medium → light and EM
Threshold excitations in field structures → quantization
As for specific reformulations:

I’ve replaced the spacetime curvature model of gravity with a pressure-gradient equation:
a = -∇P / ρ
I’ve rederived the wave equation from mechanical field terms:
∇²φ − (1/c²) ∂²φ/∂t² = 0
I’ve constructed a unified Lagrangian that describes both scalar (gravitational) and torsional (EM) field behavior
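The wave equation quoted above can be checked numerically; the following is a generic finite-difference sketch (my illustration, not Macca's code) showing that an initial pulse governed by that equation splits and propagates at speed c:

```python
import numpy as np

def simulate_wave(c=1.0, L=100.0, nx=1000, nt=500):
    """Leapfrog finite-difference solution of the 1-D wave equation
    grad^2(phi) - (1/c^2) d^2(phi)/dt^2 = 0, with fixed ends."""
    dx = L / (nx - 1)
    dt = 0.5 * dx / c                        # CFL-stable time step
    x = np.linspace(0.0, L, nx)
    phi = np.exp(-((x - L / 2) ** 2) / 2)    # Gaussian pulse, initially at rest
    phi_prev = phi.copy()
    r2 = (c * dt / dx) ** 2
    for _ in range(nt):
        phi_next = np.empty_like(phi)
        phi_next[1:-1] = (2 * phi[1:-1] - phi_prev[1:-1]
                          + r2 * (phi[2:] - 2 * phi[1:-1] + phi[:-2]))
        phi_next[0] = phi_next[-1] = 0.0     # fixed boundaries
        phi_prev, phi = phi, phi_next
    return x, phi
```

After nt · dt ≈ 25 time units the pulse has split into two half-amplitude copies centred near x = 25 and x = 75, i.e. it has travelled at exactly c, which any mechanical-medium derivation of this equation must reproduce.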
And there are real-world implications already:

I’ve used orbital drift data (e.g. from the Moon) to derive a gravitational propagation speed well above c, consistent with a compressible medium, not curved spacetime
I’ve shown how time dilation can be modeled as field compression, not just relativistic clocks
The framework also opens up doors to energy extraction, propulsion, and fusion under new field dynamics
So yes — the ultimate goal is to reformulate the major equations of physics under this unified mechanical lens. It's early-stage but grounded in concrete logic and empirical observations.

Haven’t heard back from Synthese yet — though I suspect this kind of thing either excites or terrifies editorial boards. Either way, I’m not waiting for permission to keep building.

Appreciate the challenge and would be glad to send over a more concise PDF if you’re curious to see where it’s going.

Best,
Macca

KM

Are you talking to me? I didn't invent a new dimension; I unified physics, gravity, and the quantum realm. And was it needed? Hell yeah it was needed, and I did it all mechanically.

Request for analysis of an innovative article on the Theory of Everything
Dear all,
I hereby kindly request the analysis of an article authored by me, which proposes an innovative approach in the field of theoretical physics, more specifically in the development of a possible Theory of Everything — a structure that aims to unify the main pillars of modern physics: quantum mechanics and general relativity.
Link to the articles: https://sciprofiles.com/user/publications/3220836
You must read the following articles in sequence: first “Theory of Obligatory Necessity: Elements and Facts Influenced by the Intensity of the Specific Physical Concept”, then “The Equations and Their Effects” and finally “The Information Promoted by the Uneven Distribution of Elements in the Universe” in order to understand the unification.
The work presents original concepts that can contribute significantly to the advancement of the understanding of the fundamental laws of the universe. Given the interdisciplinary nature and theoretical boldness of the proposal, I believe that a careful and critical reading could offer significant insights, both for improving the text and for the broader scientific debate.
I am available to provide any additional information, as well as the data and foundations used in the construction of the theory.
I would like to thank you in advance for your attention and time.
Sincerely, Carlos Eduardo Ramos Cardoso

Hi all,

For those exploring entropy, recursion, or motion-based collapse, I'd like to share two formally published works that define a structural physics model using directional motion (Δm) instead of time: Motion-Based Physics.
Motion-Based Physics is a structural framework that redefines system survival, entropy, and collapse through recursive directional motion. It is not symbolic in the literary sense; it is a post-classical physics model rooted in motion integrity rather than observation-based timelines or scalar time.

Entropy Collapses in Motion
DOI: https://doi.org/10.5281/zenodo.15661015

Latnex: Motion-Based Structural Mathematics
DOI: https://doi.org/10.5281/zenodo.15620561

The framework introduces a collapse condition based on recursive motion thresholds (ΣΔm, ΔΔm ≥ Ct), with entropy defined as a failure of sustained directional motion. It operates outside the jurisdiction of the Second Law of Thermodynamics; entropy does not govern systems where motion persists and recursion holds. This is not a metaphor; it models survival and collapse across physical, cognitive, and engineered systems.

This work is formally copyrighted.
My works were introduced on April 10th on Archive and on April 16th on Figshare.

Michael Aaron Cody

17 days later

Hello everyone! Here is a link to a draft exploring how a discrete BCC lattice of binary mass quanta might underlie the Standard-Model fermion mass hierarchy, emergent quantum field dynamics, and an effective Einstein–Hilbert action. I’d be extremely grateful for any constructive feedback on the core assumptions, mathematical clarity, or potential phenomenological tests. Thank you in advance for taking the time to read and comment!

(Working Draft) A Discrete BCC Lattice Framework for Fermion Masses, Quantum Fields, and Gravity

A Tested Curvature-Based Gravitational Model with Falsifiable Results

Hello FQxI team and community,

Following up on a recent email exchange, I’m publicly sharing the working equation of a tested gravitational model that produces accurate predictions without requiring dark matter, dark energy, or empirical curve-fitting.

This is not a speculative framework. It is a falsifiable, implemented model with results aligned directly against astrophysical observation.

🔹 Core Equation (QIR Engine):

\Delta X = \pi \cdot \frac{M^a D^b I}{c\,(1 + \log(1 + MDI))} \cdot \frac{1}{1 + N\,\Delta X}

Where:

  • M: baryonic mass of the system
  • D: distance (in kpc or Mpc as needed per scale)
  • I: phase-aligned information density (entropy per unit area, empirically derived)
  • ΔX: observable deviation — lensing angle, orbital curve, curvature offset
  • Constants: a = 1.876, b = 0.389, c = 0.475, N = 0.0000932

This model does not use free parameters per object. All constants are global, derived from saturation behavior in recursive tests.
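Since ΔX appears on both sides of the core equation, it has to be solved self-consistently. A minimal fixed-point sketch, using the quoted global constants and assuming a natural logarithm and dimensionless inputs (the post specifies neither), would be:

```python
import math

# Global constants as quoted in the post
A, B, C, N = 1.876, 0.389, 0.475, 0.0000932

def qir_delta_x(M, D, I, tol=1e-12, max_iter=100):
    """Solve dX = pi * M^A * D^B * I / (C * (1 + log(1 + M*D*I))) / (1 + N*dX)
    for dX by fixed-point iteration. The log is assumed natural and the units
    of M, D, I are assumed consistent -- neither is stated in the post."""
    K = math.pi * M ** A * D ** B * I / (C * (1.0 + math.log(1.0 + M * D * I)))
    dX = K                           # starting guess: the N = 0 solution
    for _ in range(max_iter):
        dX_new = K / (1.0 + N * dX)  # one fixed-point step
        if abs(dX_new - dX) < tol:
            break
        dX = dX_new
    return dX
```

Because the relation is equivalent to the quadratic N·ΔX² + ΔX − K = 0, the iteration converges in a handful of steps for the small quoted N; reproducing the claimed lensing residuals would additionally require the unstated units for M, D, and I.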

🔹 Empirical Behavior:

  • Matches galactic rotation curves using only visible baryonic input
  • Reproduces observed strong lensing arcs with residuals < 0.0005 arcsec
  • Generates cosmic acceleration without Λ or inflation
  • Resolves black hole information paradox through harmonic closure
  • Replaces the Big Bang with a curvature-saturation bounce
  • ADM-compatible, derived from a reformulated Einstein-Hilbert action

🔹 Access and Reproducibility:


I’m sharing this here for open critique, challenge, or collaboration. The model is deterministic, falsifiable, and built entirely from observable terms. If it's broken, I want to know. If it holds, I'm ready to build.

Christopher P. B. Smolen
axviam@proton.me

Hello everyone,
My name is Abdallah Ahmed Salem, an independent researcher from Egypt, and I would like to share a speculative but testable quantum model I’ve been developing: Quantum Resonant Energy Amplification (QREA)

The idea explores whether it is possible to extract directed energy or influence the quantum path of a particle (like an electron or photon) through a sequence of weak measurements. By introducing a weak-guided path mechanism — inspired by delayed-choice and weak value amplification concepts — the model suggests that under specific resonance conditions, energy or information could be cumulatively redirected without full collapse of the wavefunction.

The project is based on two pillars:

  1. Directional quantum guidance via weak interaction — where the wavefunction "steers" without collapsing.
  2. Energy amplification through interference and information extraction — resembling resonant tunneling effects, but from measurement theory.

I’ve already implemented several numerical simulations which show intriguing behavior: a particle subjected to controlled weak guidance appears to accumulate path deviation or energy concentration. The goal is to transition this idea into a real lab experiment with single photons or electrons.
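For readers who want to play with the qualitative claim, here is a minimal generic sketch (not Abdallah's simulation) of repeated weak measurements on a qubit, using standard two-outcome Kraus operators: each step only nudges the state, yet the nudges accumulate into full collapse at the Born-rule frequencies.

```python
import numpy as np

def weak_measure_trajectory(p0=0.7, eps=0.2, steps=200, rng=None):
    """Repeated weak measurements of a qubit in the {|0>, |1>} basis.
    Kraus operators M± = diag(sqrt((1±eps)/2), sqrt((1∓eps)/2)) realise a
    measurement of strength eps; p tracks the probability of |0>."""
    rng = rng or np.random.default_rng()
    p = p0
    for _ in range(steps):
        prob_plus = (1 + eps) / 2 * p + (1 - eps) / 2 * (1 - p)
        if rng.random() < prob_plus:               # "+" nudges toward |0>
            p = (1 + eps) / 2 * p / prob_plus
        else:                                      # "-" nudges toward |1>
            p = (1 - eps) / 2 * p / (1 - prob_plus)
    return p
```

Each single step changes p only slightly (no full collapse), but an ensemble of trajectories ends up pinned near 0 or 1 with frequencies matching the initial probabilities; this back-action budget is what any cumulative "steering" scheme has to work against.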

I believe this model might offer a new way to understand quantum back-action and energy-information transfer during weak measurement regimes.

I’d be grateful for your comments, questions, or constructive criticism — especially regarding feasibility, related literature, or potential experimental paths.

Thank you!
Abdallah Ahmed Salem
mostemphy@gmail.com

Robert McEachern
Hi Rob,

Are you entering an essay in this year’s anonymous FQxI essay competition (https://qspace.fqxi.org/competitions/introduction)? I probably won’t, because of time pressures.

But re your view of how the world works: how come your proposed system knows itself, and how come your proposed system moves itself?

(The same question would apply to all the above people, with their “Alternative Models of Reality”.)

Let’s face it: numbers, e.g. zeroes and ones, CAN NEVER BE information, UNTIL the numbers have a category, and UNTIL something knows about these categories and numbers.

    Lorraine Ford
    I am not planning on entering the contest.

    In my view, "In the Beginning", there is no knowledge, movement or numbers, as you and physicists conceive of such things. And there are only two categories: (1) something or (2) nothing.

    If there is nothing, then there is nothing to either talk about, or to do the talking; so end of story.
    But if there is something, then that something has just two categories; (1) something capable of detecting the mere existence of something else and (2) something that cannot detect the existence of something else.

    The second category is not the cosmos we exist in; so again, end of story.

    That leaves just one category; a cosmos in which something can, at least occasionally, succeed at detecting, the mere existence, of something else. How?

    If that something, "In the Beginning", consists entirely of just "noise" AKA chaotically behaving "something", how can any order or deterministic "something" possibly ever arise/emerge, not quite out of "nothing", but out of the only stuff that actually exists - chaotically behaving "something"?

    Almost 80 years ago, Claude Shannon discovered an amazing, sufficient (but not necessary) condition for that to occur. And that is what neither you, nor the physicists, have ever even begun to understand: "noise" can merely use itself, as a "fingerprint", to detect the mere existence of something else, something similar to itself, via matched filtering (fingerprinting), which thereby enables deterministic "cause and effect" itself, as an observable phenomenon, to emerge out of pure chaos.

    Counterintuitive? Yes. Magic? No. End of story? No. Rather, it is the beginning of the story. Our story. The story of all of reality, and how it is able to emerge, not quite out of "nothing", but out of "something" that is just chaos.

    Want an example of this? Just think of one "noise-like" strand of DNA, matching (fingerprinting) another. Then apply that same principle to a much more elementary system: two "entangled" quantum particles. And lo and behold, you get the Heisenberg Uncertainty Principle, "spooky" Bell correlations, and everything else.
    Follow this link, for some further insights, regarding the above
    You will have to scroll down through the comments, to find mine, posted on Saturday, September 25, 2021 8:47am.
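The matched-filtering claim above is easy to demonstrate concretely; the sketch below (a generic illustration of the standard technique, not Robert's code) buries a copy of a random "noise fingerprint" inside unrelated noise and recovers its location by cross-correlation:

```python
import numpy as np

rng = np.random.default_rng(42)

# A long random sequence serves as its own "fingerprint" (template).
template = rng.standard_normal(1024)

# Bury a copy of the template at a known offset in unrelated background noise.
offset = 3000
received = rng.standard_normal(8192)
received[offset:offset + len(template)] += template

# Matched filter: cross-correlate the received stream with the template.
score = np.correlate(received, template, mode="valid")
detected = int(np.argmax(score))   # the peak marks where the fingerprint sits
```

Even though no individual sample is "measurable" in any useful sense, the correlation peak at the true offset stands roughly 30 standard deviations above the background, so the yes/no detection decision is essentially error-free, which is the Shannon-style point being made above.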

      After reading the paper on the origin of space and time, I found myself strongly resonating with the author’s main thesis. The paper emphasizes that spacetime is not a priori background to the universe, but rather an emergent phenomenon arising from deeper physical structures. This is highly consistent with the core view of my Vibrational Vacuity Unified Field Theory: space and time are, in essence, the result of specific distributions and dynamic tunings of the etheric (holographic information) field under certain conditions, rather than independently existing entities.

      Regarding space, I see it as a mapping of the etheric field’s information density, fundamental frequency, spin, and other multidimensional parameters onto the physical world. The dimensions and structure of space are entirely determined by how the holographic information of the etheric field is organized and tuned. The paper’s proposal that space can emerge from more fundamental structures directly supports my perspective.

      As for time, I have always maintained that it is the evolutionary trajectory of the etheric field’s fundamental frequency—a path length in the parameter space of the holographic information field. The paper’s emphasis on time as an emergent product of quantum structure and information flow aligns perfectly with my understanding that “time is a measure of changes in energy and information states.”

      The paper also discusses various theories—such as string theory, quantum gravity, and causal dynamical triangulation—that attempt to explain the emergence of spacetime from more fundamental quantum structures or information units. I would go further and point out that the emergence of spacetime is actually the result of the self-organization of the etheric field’s multidimensional parameters and holographic feedback, highlighting the roles of holography and dynamic tuning.

      In addition, I would like to add a few points. First, space and time are not only emergent from the information field, but also possess holographic and nonlocal properties. Any local change in information can, through holographic mechanisms, influence the overall structure of spacetime—this provides a physical basis for explaining phenomena such as quantum entanglement and nonlocal correlations. Furthermore, I believe that the observer’s consciousness itself acts as a higher-order resonance interface of the information field, capable of participating in and even influencing the process of spacetime emergence. This viewpoint offers a new explanatory angle for the relationship between subjective experience, measurement, and physical reality. Moreover, the structure and evolution of spacetime are not static, but are continually adjusted as the parameters of the etheric field dynamically change. Under extreme conditions, spacetime can even “dissolve,” “increase in dimensionality,” or “reorganize,” which provides a theoretical foundation for understanding extreme phenomena such as the early universe, black holes, and singularities.

      I believe that the strength of this kind of theory lies in its ability to describe space, time, matter, energy, and consciousness within a unified framework of the etheric field’s holographic information. The relevant formulas can be linked with observational data such as the cosmic microwave background, redshift, and gravitational waves, allowing for testable physical predictions. It also enables dialogue with mainstream theories like string theory and quantum gravity, enriching the mathematical and physical content.

      In summary, I see the origin of space and time as the emergent result of the etheric field’s holographic information structure and energy flow. This view not only aligns closely with the “spacetime emergence” concept proposed in the paper, but also further emphasizes the importance of holography, dynamic tuning, nonlocal feedback, and the participatory role of observer consciousness, providing a more complete and profound theoretical foundation for understanding the nature, structure, and evolution of the universe.

      Robert McEachern
      Rob,

      You say that there are 2 original categories: something and nothing. But in order to have a mathematical system, you need 3 things: categories, relationships between the categories, and numbers that apply to the categories. Where are the relationships coming from, and where are the numbers coming from? Where is the mathematical or logical proof that these missing basic mathematical aspects, i.e. relationships and numbers, can “emerge out of pure chaos”?

      (And where is this chaos coming from anyway? Because mathematically, both the thing labelled “chaos” and the thing labelled “order” emerge out of underlying, genuine, mathematical and algorithmic order. There is no such thing as order out of chaos: there is only the superficial appearance of order emerging out of pre-existing, underlying, genuine mathematical and algorithmic order.)

      Also, how does the mathematical system know itself? I.e. how come the system has the ability to detect/ distinguish something from nothing; the ability to detect/ distinguish one category from another category, to detect/ distinguish one relationship from another relationship, and the ability to detect/ distinguish one number from another number? As you have acknowledged, a system can’t exist without the ability to detect/ distinguish. I.e. a system can’t exist without a basic type of awareness/ consciousness of itself.

      Another issue is: once you’ve got your categories and relationships, why are the numbers moving at all (whether “chaotically” or non “chaotically”)? But it is not just numbers: these numbers apply to the categories. I.e. some aspect of the mathematical system is not only jumping the numbers, but this aspect of the mathematical system is also assigning the numbers to the categories. If the world ever moves in any way, or continues to move in any way, then there needs to be an aspect of the system that causes number movement, and number movement is actually quite a complicated thing because it involves both categories and numbers.

        Robert McEachern
        Yes, it is true that "Reality does not need a mathematical system".

        But like it or not, we continue to symbolically represent the low-level world mathematically because, when the world is measured, and when the numbers that apply to the measured categories are analysed, relationships have been found to exist between the measurable categories. Our only way of trying to understand the low-level world is to talk in terms of these categories, relationships and numbers, and in terms of a mathematical system. However, these categories, relationships and numbers are merely the way we need to think about, and symbolically represent, what actually exists.

        Also, I'm saying that we can logically conclude that it IS necessary for this low-level mathematical system to be able to detect/ distinguish (what we would symbolically represent as) its own actual categories, relationships and numbers from the very large number of possible categories, relationships and numbers that could theoretically, potentially exist. In other words, low-level reality/ the low-level world needs to be able to detect "what is true", and to distinguish what is true from what is not true.

          Lorraine Ford

          "But like it or not, we continue to..."

          Only you and the physicists, continue to ignorantly do that...

          Communications engineers ceased doing that, generations ago; and thereby changed our world, forever...

          Because, 80 years ago, Shannon proved that when you no longer bother with even trying to "symbolically represent the low-level world" with measurable numbers, and instead represent it with totally unmeasurable, but perfectly detectable, long sequences of random white noise, then there is no longer any "measurement problem" whatsoever, no longer any uncertainty in any measured quantity, and no significant possibility of ever making an error in any Yes/No decision about whether or not the sequences being looked for were, or were not, successfully detected at the exact location of the detector.

          It is only after you have detected those unmeasurable, noise-encoded symbols, with no errors whatsoever, and only then, that you can start to use mathematics and numbers, without having to worry about all the "measurement" errors, trashing all your subsequent calculations, and thereby inducing generations of quantum physicists to propose all sorts of absurd "interpretations", of their trashed results.

          Mother Nature appears to have discovered this amazing "trick", eons before Shannon ever did. And that is what made it possible for, deterministic "cause and effect" to ever emerge, from an otherwise chaotic environment.

            A Thought Experiment: Is Belief Structurally Embedded in Reality?

            While writing my book, I kept circling one question: Is the double-slit experiment hinting at something deeper—beyond observation? What if belief itself structurally affects reality—even down to the quantum level?

            I’m not a physicist. I’m just someone who’s spent a lifetime noticing patterns, questioning anomalies, and holding onto questions nobody seemed to have answers for. With help from generative algorithms to assist with math formatting (I haven’t done serious math since tutoring it in college), I developed a conceptual framework I’ve named the Quantum Expectation Collapse Model (QECM).

            This theory proposes that wavefunction collapse isn’t just triggered by observation—it’s modulated by belief, emotional resonance, and expectation. It attempts to bridge quantum behavior with our day-to-day experience of reality.

            🧠 Quantum Expectation Collapse Model (QECM)

            A Belief-Driven Framework of Observer-Modulated Reality

            By Jeremy Broaddus

            Core Concepts

            • Observer Resonance Field (ORF): Hypothetical field generated by consciousness, encoding belief/emotion/memory. Influences collapse behavior.

            • Expectation Collapse Vector (ECV): Directional force of emotional certainty and belief. Strong ECV boosts fidelity of expected outcomes.

            • Fingerprint Collapse Matrix (FCM): Individual’s resonance signature—belief structure, emotional tone, memory patterns—all guiding collapse results.

            • Millisecond Branching Hypothesis: Reality forks at ultra-fast scales during expectation collisions, generating parallel experiences below perceptual threshold.

            • Macro-Scale Conflict Collapse: Massive ideological clashes (e.g., war) create timeline turbulence, leaving trauma echoes and historical loop distortion.

            Mathematical Framework (Conceptual)
            Let:

            • \Psi(x,t) = standard wavefunction

            • \phi = potential eigenstate

            • \mathcal{F}_i = observer fingerprint matrix

            • \mathcal{E}(\mathcal{F}_i) = maps fingerprint to expectation amplitude

            • \alpha = coefficient modulating collapse sensitivity to expectation

            Then:

            P_{\text{collapse}} = |\langle \phi | \Psi \rangle|^2 \cdot \left[ 1 + \alpha \cdot \mathcal{E}(\mathcal{F}_i) \right]

            Interpretation: Collapse probability increases when observer’s belief/resonance aligns with the measured outcome.

            Time micro-fracturing:
            t_n = t_0 + n \cdot \delta t \quad \text{where} \quad \delta t \approx 10^{-12}\ \text{s}

            During high-belief collision:

            \Psi_n \rightarrow \Psi_{n,A},\ \Psi_{n,B}

            Each path retroactively generates coherent causal memory per branch.

            Conflict collapse field:
            \mathcal{C} = \sum_{i=1}^{N} \mathcal{E}(\mathcal{F}_i)

            (i.e. the total “expectation force” of all (N) observers, found by summing each observer’s expectation amplitude.)

            Timeline stability:

            S = \frac{1}{1 + \beta \cdot |\mathcal{C}|}

            Higher \mathcal{C} = more timeline turbulence = trauma echo = historical distortion
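            Taken at face value, the three formulas of the framework are simple to transcribe; the sketch below does so directly, with alpha, beta, and the expectation amplitudes as made-up illustrative numbers (the model does not specify how to obtain them). Note that the multiplicative expectation factor can push the "probability" above 1, so any serious version would need renormalization:

```python
import numpy as np

def p_collapse(psi, phi, alpha, expectation):
    """Born probability |<phi|psi>|^2 times the post's expectation factor.
    Not renormalized -- values above 1 expose a gap in the model."""
    born = abs(np.vdot(phi, psi)) ** 2
    return born * (1 + alpha * expectation)

def conflict_field(expectations):
    """C = sum_i E(F_i): total expectation amplitude over all N observers."""
    return float(sum(expectations))

def timeline_stability(C, beta):
    """S = 1 / (1 + beta * |C|)."""
    return 1.0 / (1.0 + beta * abs(C))

# Illustrative (made-up) numbers:
psi = np.array([np.sqrt(0.7), np.sqrt(0.3)])  # observer's quantum state
phi = np.array([1.0, 0.0])                    # eigenstate being asked about
p = p_collapse(psi, phi, alpha=0.1, expectation=0.5)  # 0.7 * 1.05 = 0.735
C = conflict_field([0.2, 0.5, 0.3])                   # C = 1.0
S = timeline_stability(C, beta=2.0)                   # S = 1/3
```

            With these hypothetical inputs the expectation factor inflates the Born probability 0.7 to 0.735, already illustrating why the [1 + alpha·E] term cannot be a probability as written.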

            Experimental Proposals

            • Measure quantum interference under varying levels of observer certainty. A simple version: a rubber-band breaking test vs. Young's modulus. Have participants buzz in, in real time, when they expect the band to snap, and compare to when it is expected to snap based on the modulus result. For best results, offer a prize for the closest buzz-in before it snaps, as an incentive. This can be done with any smartphone and a rubber band, even by yourself: use a mic app to record the sound of it breaking and a simple buzzer timestamp app.

            • Explore collapse modulation via synchronized belief (ritual, chant, intent)

            • Examine déjà vu/dream anomalies as branch echo markers

            • Investigate emotional healing as expectation vector realignment

            Closing Thought
            Expectation isn’t bias. It’s architecture.

            Destiny isn’t predestination—it’s resonance alignment.

            The strange consistency of the double-slit experiment across centuries may be trying to tell us something profound. In 1801, waves were expected—and seen. In the 1920s, particles were expected—and seen. Maybe reality responds not just to instruments… but to the consciousness behind them.

            Would love to know what actual physicists think. Tear it apart, build on it, remix it—I’m just here chasing clarity.

            Notes

            \mathcal{C} (calligraphic C) is our notation for the total expectation "force" of all observers. The sum

            \mathcal{C} = \sum_{i=1}^{N} \mathcal{E}(\mathcal{F}_i)

            is simply our way of adding up everyone's "expectation amplitude" to get a single measure of total belief-tension (or "conflict field") in a system of N observers. Here's the breakdown:

            • \mathcal{F}_i — the Fingerprint Matrix for observer i: encodes their unique mix of beliefs, emotions, memory biases, etc.

            • \mathcal{E}(\mathcal{F}_i) — a real-valued function that reads that fingerprint and spits out an Expectation Collapse Vector (ECV), essentially "how strongly observer i expects a particular outcome."

            • \sum_{i=1}^{N} — adds those expectation amplitudes for all N observers in the scene.

            So

            \mathcal{C} = \mathcal{E}(\mathcal{F}_1) + \mathcal{E}(\mathcal{F}_2) + \dots + \mathcal{E}(\mathcal{F}_N)

            is just saying "take everyone's bias-strength number and sum it."

            We then feed \mathcal{C} into our timeline-stability formula

            S = \frac{1}{1 + \beta \, |\mathcal{C}|}

            so that higher total tension |\mathcal{C}| → lower stability → more "timeline turbulence" or conflict residue.

            In short, \mathcal{C} is the aggregate expectation "force" of a group, and by summing each person's \mathcal{E}(\mathcal{F}_i) we get a single scalar that drives the rest of the model's macro-scale behavior.

            — Jeremy B