Karen,

No misunderstanding. Surely it is not true that "For most people (except when on psychedelics) it is nigh impossible to imagine anything without picturing it 'in' space and time." Mathematicians do it routinely, straight and high, and describe it besides. You must not have studied topology.

That quote is from The Meaning of Relativity, Princeton paperback, fifth edition, 1956, p. 55, and is completely in context. In fact, how dare you. Einstein based general relativity on Mach's principle (a term he coined) and couldn't make it work because there are no isolated systems (which would constitute a background, in your terms).

Addendum:

I just noticed that Bollinger defended entanglement-based quantum technology by pointing to the successful protection of secrets. I wonder why there is not yet similar success with computers. In East Germany I sometimes witnessed the bad habit of hiding poor work by declaring it top secret. Militaries and secret services tend to need justification for demanding so much money.

Did you in the meantime have a look at the essays I mentioned, and at mine?

Dear Karen,

Thanks for the clarification---I think I had misunderstood your endeavor a little: your project is more sociological (working out what current physics does consider to be fundamental) than normative (working out what it should consider fundamental) in nature. I think I like this take better---in a sense, what we're looking for limits what we can find, so if there is some sort of systematic bias in the theories we are willing to accept as fundamental, one that precludes us from formulating certain theories (which is just what seems to be happening at the moment with the Higgs mass's failure to be 'natural'), we should be ready to face these biases and, if possible, remove them.

Although, as I think Putnam remarked when contemplating the many worlds interpretation of quantum mechanics, what good is a metaphysics one can't believe? What if the world is such that if we ever were told its fundamental nature, we'd flat out not believe it? But that's really just idle speculation.

Regarding whether we can formulate all physical explanations, this recalls McGinn's distinction between 'physicalism' and 'physics-alism'---the former being the metaphysical stance that everything is, ultimately, physical in nature, and the latter the epistemological stance that everything admits of explanation in terms of the science of physics. He makes the point that the former doesn't necessarily imply the latter, and that really, we don't have much of a reason to believe the latter ought to be true, save for a certain kind of epistemic hubris. Personally, I think he's got a point there (although I probably would have vigorously rejected it when I started out studying physics).

Anyway, thanks again for your answer, it's really helped put things in focus for me!

Dear Avtar,

Thank you very much!

Your idea about the fundamental being the final, zero state of the universe sounds interesting. Why the final state instead of the initial one, or are they the same? Would it be a quantum state of spacetime? Would it be subject to fluctuations?

I'll try to check your essay before the deadline, but please forgive me if I run out of time, since I have a lot to get through at the moment.

Best,

Karen

Karen,

A quick addendum to my comments above, which is a hypothesis:

In the absence of perturbative opportunities, the computational costs of fully formal methods for complete, end-to-end solutions trend towards infinity.

The informal proof is that full formalization implies fully parallel combinatorial interaction of all components of a path (functional) in some space, that being XYZ space in the case of approaching an electron. The computational cost of this fully parallel optimization then increases both with decreasing size of the path segments used, and with path length. The granularity is the most important parameter, with the cost rapidly escalating towards infinity as the precision (the inverse of the segment length) increases towards the limit of representing the path as an infinitely precise continuum of infinitely precise points.

Conversely, the ability to use larger segments instead of infinitesimals depends on the scale structure of the problem. If that scale structure enables multiscale renormalization, then the total computational cost remains at least roughly proportional to the level of precision desired. If no such scale structure is available, the cost instead escalates towards infinity.
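To make the cost scaling concrete, here is a minimal toy sketch in Python (my own illustration only; the curve, the function name, and the segment counts are arbitrary choices, not part of the hypothesis): approximating the length of a curved path with n straight segments. The cost is simply n, which grows like the inverse of the segment size as the precision improves, whereas a closed-form answer would cost essentially nothing.

    import math

    def path_length(n):
        # Arc length of y = sin(x) on [0, pi], approximated with n straight segments.
        # Cost = n segments, i.e. roughly 1 / (segment size).
        xs = [math.pi * i / n for i in range(n + 1)]
        return sum(math.hypot(xs[i + 1] - xs[i],
                              math.sin(xs[i + 1]) - math.sin(xs[i]))
                   for i in range(n))

    for n in (10, 100, 1000, 10000):
        print(n, path_length(n))   # precision improves, but only as the cost grows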

But isn't the whole point of closed formal solutions that they remain (roughly) linear in computational cost versus the desired level of precision?

Yes... but what if the mathematical entities we call "formal solutions" are actually nothing more than the highest-impact granularities of what are really just perturbative solutions made possible by the pre-existing structure of our universe?

Look for example at gravity equations, which treat stars and planets as pointlike masses. However, that approximation completely falls apart at the scale of a planet's surface, and so is only the first and highest-level step in what is really a perturbative solution. It's just that our universe is pre-structured in a way that makes many such first steps so powerful and so broadly applicable that it allows us to pretend they are complete, stand-alone formal solutions.

So, I'll end for now with an even more radical hypothesis:

All formal solutions in physics are just the highest, most abstract stages of perturbative solutions that are made possible by the pre-existing "lumpy" structure of our universe.

So... ah... hmm! No, I'm not done. The above hypothesis is not radical enough. One more issue needs to be addressed.

Human cognition must rely on bio-circuitry that has very limited speed, capacity, and accuracy. In the mathematical domain it therefore relies very heavily on Kolmogorov programs to represent the useful patterns we see in the physical world, since a Kolmogorov program need only be executed to the level of precision actually required.

Furthermore, it is easier and more compact to process suites of such human-brain-resident Kolmogorov programs as the primary data components for reasoning about complexity, as opposed to using their full elaborations into voluminous data sets that are more often than not beyond neural capacities. In addition to shrinking data set sizes, reasoning at the Kolmogorov program level has the huge advantage that such programs capture in direct form at least many of the regularities in those data sets, which in turn allows much more insightful comparisons across programs.

We call this "mathematics."
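As a minimal sketch of what I mean by reasoning with the program rather than its elaboration (a toy example of my own, with an arbitrary choice of sequence): a few bytes of generator code stand in for an unbounded data set, and we expand it only as far as the task at hand requires.

    from itertools import islice

    def powers_of_two():
        # A tiny 'Kolmogorov program': a few bytes standing in for an infinite data set.
        n = 1
        while True:
            yield n
            n *= 2

    # We never materialize the full data set; we run the program only to the
    # precision (here: the number of terms) actually needed.
    print(list(islice(powers_of_two(), 10)))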

The danger in not recognizing mathematics as a form of Kolmogorov program creation, manipulation, and execution is that, as biological intelligences, we are by design inclined to accept such programs as representing the full, to-the-limit forms of the represented data sets. Thus the Greeks assumed the Platonic reality of perfect planes, when in fact the physical world is composed of atoms that make such planes flatly impossible. The world of realizable planes is instead emphatically and decisively perturbative, allowing the full concept of "a plane" to exist only as an unobtainable limit of the isolated, highest-level initial calculations. The reality of such planes falls apart completely when the complete, perturbative, multi-step model is renormalized down to the atomic level.

That is to say, exactly as with physics, the perfect abstractions of mathematics are nothing more than top-level stages of perturbative programs made possible by the pre-existing structure of our universe.

The proof of this is that whenever you try to compute such a formal solution, you are forced to deal with issues such as scale or precision. This in turn means that the abstract Kolmogorov representations of such concepts never really represent their end limits, but instead translate into huge spectra of precision levels that approach the infinite limit to whatever degree is desired, but only at a cost that increases with the level of precision. The perfection of mathematics is just an illusion, one engendered by the survival-focused priorities of how our limited biological brains deal with complexity.

The bottom line is this even broader hypothesis:

All formal solutions in both physics and mathematics are just the highest, most abstract stages of perturbative solutions that are made possible by the pre-existing "lumpy" structure of our universe.

And looking at what I just wrote... yes, I will be so bold as to assert with a high level of certainty that the above hypothesis is correct.*

In physics, even equations such as E=mc² that hold absolutely at large scales cannot be interpreted "as is" at the quantum level, where virtual particle pairs distort the very definition of where mass is located. E=mc² is thus more accurately understood as a high-level subset of a multi-scale perturbative process, rather than as a complete, stand-alone solution.

In mathematics, the very concept of an infinitesimal is a limit that can never be reached by calculation or by physical example. That makes the very foundations of real-number mathematics into a calculus not of real values, but of sets of Kolmogorov programs for which the limits of execution are being intentionally ignored.
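A small numerical sketch of that point about limits (again only a toy illustration of my own; the function and step sizes are arbitrary): a finite-difference estimate of a derivative approaches the limiting value as the step shrinks, but no finite execution ever reaches it, and pushing the step toward zero eventually costs accuracy on real hardware.

    import math

    def forward_diff(f, x, h):
        # Finite-difference estimate of f'(x); the 'true' derivative exists only as the h -> 0 limit.
        return (f(x + h) - f(x)) / h

    exact = math.cos(1.0)   # exact derivative of sin at x = 1
    for h in (1e-1, 1e-4, 1e-8, 1e-12):
        est = forward_diff(math.sin, 1.0, h)
        print(h, est, abs(est - exact))   # error shrinks, then grows again from round-off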

Given the indifference to, and often even the lack of awareness of, the implementation spectra that are necessarily associated with all such formalisms, is it really much of a surprise how often unexpected infinities plague problems in both physics and math?

Cheers,

Terry

----------

* This real-time, on-the-fly restructuring of all of physics and mathematics has been brought to you courtesy of The FQXi Essay Program, 2017, which has encouraged just this kind of re-examination of fundamentals by folks like yours truly. How's that for blame shifting?... :)

Space and time are concepts introduced into science by us humans in order to enable us to study nature as it is seen and sensed by us and our instruments. How far we can rely on them is a question an alien might ask us! One needs to be an external observer in order to picture the universe we live within! I wonder that we talk of other universes, and that we have been attempting Artificial Intelligence as a discipline to deal with it! Reading your essay was refreshing, but it made me wonder whether we can go wrong in conceptualising the postulates of the scientific methodology we have adopted for studying nature. Is it conditioning us in some way, constraining the free will that philosophy considers essential for developing any subject of study innovatively? I am in the minority as an experimentalist in this context, as most scientists participating are theoreticians and depend on maths as their tool for progress in science! Can we innovate our tools of operation now? Are those we have already chosen the best options? If you find time, you may visit our very short essay of 2-3 pages and provide us with your comments and rating on the same.

    Karen,

    My essay leads off with a discussion of causality, in which I refer to

    1. J.B. Hartle, S.W. Hawking and T. Hertog, "The Classical Universes of the No-Boundary Quantum State", arXiv:0803.1663v1 [hep-th], March 2008, which formulates a causal particle (lacking QC/ED), and

    2. N. Seiberg, L. Susskind and N. Toumbas, "Space/Time Non-Commutativity and Causality", arXiv:hep-th/0005015v3, May 2000, which sets the criteria (which, btw, the Standard Model does not meet, hence the meekly stated criterion).

    So yes, I very much meant a particle theory mathematically consistent with GR. Not good news for SU(3)xSU(2)xU(1), since the consistency criteria make similar stipulations on the form of the particle |H> (representation algebra).

    The criterion for consistency stems from 3. G. Takeuti, Proof Theory, Dover Publications, 1975, which requires that a particle be finite, and of course that there be no infinities in the theory. The reason that we renormalize is that the point-like approximation induces an infinity/infinity (one that cannot be resolved by L'Hôpital's rule), which is removed by imposing a 'fudge factor' -- actually several of them. { Of course I didn't call it that to Prof Gell-Mann when he came to OSU to pitch the SSC! ;}

    String theory meets the finitary criterion and has a consistent mass formulation, but can't find QC/ED without a stiffness-induced quantum state MAP. {Just had to work the word map in there ;} There are several examples of finitary representation geometries being studied... I won't trouble to list more.

    As far as replicating QC/ED (QFT) and GR, the criteria place a strict caveat on that as well, so don't expect your dad's GR to emerge. GR is the least affected, since all that one needs is to correctly interpret the temporal curvature term as an imaginary quantity. The particle theory folks have to convert to a cross product of two wreath products as their representation algebra; to be sure, they don't like it. Although it is a remarkably simple and beautiful algebra... one which can be, and has been, taught to HS students.

    So insofar as you put it "the older theories should be derivable from the new one, under some relevant conditions", yes. But sadly for nearly everyone, NOT the reverse. Here I recommend you also read Sabine Hossenfelder's essay.

    I discuss and present for consideration a foundational formula at the end of my essay, so perhaps read it from the end backward. I approached the topic as sort of a "deep dive" to reach the most fundamental insight toward the end, with a fairly rapid ascent to address some old questions that are very much within the bounds of the contest topic.

    The "relevant conditions" mostly have to do with the space-time average SCALE. {to derive GR eqn, set time to "now +/- 100 years" the foundational QC/ED state algebra averages out, as is well-known ;}

    Wayne

    Invite me out for a seminar? I have an hour or two of abstract formal discussion on all mathematically related topics.

    p.s. Interestingly, you add "unique". Here there is a strict mathematical meaning, which, when imposed, makes the need for 'several' fundamental criteria a bit of overkill. The reason is that in math, virtually all uniqueness proofs involve a cyclic variable. Euler's equation for a circle, that is e^(i theta)... in which I choose theta as theta_(mass-time). There is no other theory with a true cyclic variable.

    Hi Eckard,

    Before I retired, my work for the US DoD was almost entirely in the open, since my main role was trying to promote funding for universities and small businesses for a variety of technology areas, including quantum.

    What's happened with quantum entanglement is that one use of it, entanglement for encrypting communications, is comparatively much easier (it's still very hard!). It is a sufficiently solved problem that you can now buy commercial communications encryption boxes. Look up for example the company ID Quantique, at https://www.idquantique.com/, or just do a Google search.

    Quantum computing in contrast is incredibly more difficult, mostly because instead of just keeping a single pair of photons quantum, you have to keep an entire computer quantum. That is not easy!

    But more importantly, because of the huge commercial potential of such devices, the commercial sector has begun investing levels of money that government research groups cannot even begin to compete with. Companies like Google and IBM are where the action is, not federal programs. When an area gets hot commercially, government research programs inevitably lose people. I watched that happen first hand when robotics suddenly got "interesting" to the private sector. All of our best demos, in particular Boston Dynamics, disappeared!

    So, I just wanted to let you know that to the absolute best of my knowledge there is nothing weird going on for quantum computing, and I say that as someone who understands the physics there pretty well. It's just really, really hard... and the solutions to it are and will continue to be far more likely to come from the enormously larger pots of money available from the private sector than from government programs.

    Cheers,

    Terry

    Dear Terry,

    Thank you for your (almost) convincing arguments. Kadin and McEachern/Traill might deal with them. Let me just briefly reveal my reasons to be unsure:

    It began with Pauli's statement that QM is the first discipline that cannot do without the imaginary unit. I found a strange change during the 1920s. They suddenly dropped the Re( ) operation without giving an explanation. Schrödinger had revealed (in his fourth communication) how he heuristically arrived at his complex wave equation. For a while I was puzzled why Heisenberg/Born's Hermitian square matrices are equivalent to Schrödinger's picture.

    This essay contest provided two insights to me:

    - The implicit assumption of a phase relative to the chosen reference point t=0 was made about a decade before Heisenberg and Schrödinger. They just inherited it.

    - Watson pointed me to a paper by Fröhner on a theorem by Riesz-Fejér that links probability theory and quantum mechanics.

    I see myself neither as a Sherlock Holmes nor as able to investigate further. I am just a bit pedantic when it comes to the correct use and interpretation of complex calculus.

    May someone else question the superposition principle, i.e. the need to work with interfering wave functions whose absolute squares are probabilities? Shut up and calculate? OK, as soon as the fundamentals are safe.

    Cheers,

    Eckard

    Dear Karen,

    Thank you very much for the answers and comments, and for the references, which I didn't know and I think are relevant.

    You said "So I understand your suggestion as being that the key to finding a more fundamental theory is to first find the "true" formulations of our current theories?"

    Yes. I think we should extract the true lessons and rethink the assumptions we made. We understood physics along a certain historical path, and we used what we knew at the time to formulate and develop it. This may introduce problems and limitations. So if we extract the essence of each principle, we can generalize it. Then we can intersect the generalizations of different principles, e.g. those distilled from GR and QM, and find the class of theories where they both apply. This may lead to an exhaustive search, but it is possible that once we understand them better, many options will be eliminated. For example, no-go theorems like Bell's eliminate a large class of models.

    Another way to reduce the possibilities is to look for rigid models, which don't have replaceable parts. For example Clifford algebras are associated to the metric, and there's a Clifford algebra which includes a typical generation of leptons and quarks, with their exact symmetries. Because it is a simple algebra, it is difficult to change it. Without such a structure (although maybe not this one) there's too much freedom to choose the symmetry groups and the representations, hence the matter fields.

    To accomplish this generalization, a very general framework may help. I think the most natural and general common framework for all theories in physics is a sheaf-theoretic formulation. The sheaf framework applies to relativistic and nonrelativistic, continuous and discrete, classical and quantum theories, and in fact can go much beyond these options, allowing us to use locales and topoi. All theories in physics can be cast in this framework, which allows generalization, and generic proofs similar to those in category theory, which may permit discussions at a metaphysical and metatheoretical level without being committed to a particular model. At the same time, sheaves emphasize the need to take into account global and topological properties of the solutions, which I think is essential for the problems of quantum mechanics (one major interest of mine is exploring the possibility of a single-world, no-collapse quantum mechanics; the obstruction is that quantum measurements seem to impose inconsistent constraints on the solutions of the Schrödinger equation, and sheaves provide powerful tools to study the obstructions to extending local solutions to global ones). I started developing this 10 years ago, and I wrote something up. I didn't submit it to a journal because I wanted to develop it more, but then I moved on to other things. One reason is that I think sheaf theory is too general and allows too many options for one person to explore, so I decided to try more direct ways while keeping the framework as a tool for thinking about them.
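    For readers not familiar with the formalism, the property I keep appealing to can be stated compactly; this is just the standard sheaf gluing axiom, not anything specific to my approach:

        % Standard sheaf gluing axiom: compatible local sections patch to a unique global one.
        % Obstructions to this gluing are what make local solutions fail to extend to global ones.
        \text{If } \{U_i\} \text{ covers } U,\ s_i \in \mathcal{F}(U_i),\ \text{and } s_i|_{U_i \cap U_j} = s_j|_{U_i \cap U_j} \text{ for all } i,j,
        \text{ then there is a unique } s \in \mathcal{F}(U) \text{ with } s|_{U_i} = s_i .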

    You said "it would seem to turn everything around and go in from the opposite direction". Yes, for example it is thought that QG will resolve the singularities, but the opposite is possible too, my (purely GR) treatment of singularities gave automatically several types of dimensional reduction proposed in other approaches in a rather ad-hoc way with the purpose to make quantum gravity renormalizable. So it was as you said, going in the opposite direction. But I am not satisfied enough with this, because I think it doesn't say what's the real theory behind. Every time there was real progress in fundamental physics, it was because of better understanding, better mathematics was revealed, a more rigid one, unifying different aspects.

    I think much can be learned from researching the various mathematical structures involved. I think topology is essential (e.g. Wheeler's geometrodynamics, in particular "charge without charge"; and even if spacetime is topologically trivial, topology is relevant in many other ways). (Topology may give the impression that we focus on the continuum and ignore the option that spacetime is discrete, but in fact the topology of manifolds came to be understood starting from Euler's polyhedral formula, and the simplicial complex approach to manifolds is still essential in the study of their topology.) The differential structure may also be important; see the work of Torsten Asselmeyer-Maluga. The causal (or conformal) structure is also essential; in particular, the Standard Model without the Higgs has conformal symmetry. Topology has revealed connections between topological properties and curvature, and I think this is a place to understand how matter unifies with gravity, to see what's beyond the Einstein equation.

    Thanks again for the discussion, and for the references. I also looked into your book's summary, I think it is great!

    Best wishes,

    Cristi Stoica, Indra's net

    Hi Thomas,

    You're right of course that we can think about and discuss abstract non-spatiotemporal properties and entities; I suppose I meant "imagine" in a more vivid sense, of picturing things. I will amend this in the next version of the essay, thank you.

    I didn't say you'd misquoted Einstein. But I apologise for the tone of my last response, and for not backing up my statements. Here is a reference to a relevant paper by a well-respected Einstein scholar: https://www.pitt.edu/~jdnorton/papers/Fateful_Prejudice_Final.pdf

    Your quote is discussed in Section 3.4 (pp. 48-50), which talks about Einstein's lifelong objections to absolute properties (i.e., those of absolute space you cite), and his frustration that special relativity privileged particular reference frames (and thus possessed some of these objectionable absolute properties).

    Dear Scott,

    Thank you! And best of luck in your larger project.

    Karen

    Dear Karen,

    I read your paper and found it very interesting in many ways. If taken strictly as presented, the nine conditions could possibly identify a most fundamental theory, depending on how they are interpreted. As an example, since the universe is constructed as a structural substance hierarchy, although it is possible to generate a complete theory that covers the structuring of all of those structural levels, such a theory could be somewhat difficult to work with. It might be easier to still compartmentalize the total theory into parts that specifically deal with each hierarchical level, or with the interactions between two levels, etc., as needed, to minimize the amount of work required to implement it at specific levels. At the same time, the overall theory could describe the overall material structural generations, their actions, and their interactions throughout the total complex structure of the universe. Also, since some structures, such as energy photons and field structures, remain essentially the same throughout all of the hierarchical levels of structure, these entities and their structures and functioning could be considered background fixed structures, but they are existent parts of the universe that would necessarily be part of any truly fundamental theory.

    The most interesting part of your paper to me is your description of the hierarchical tower of theories at different size scales of the universe. You do not mention that there are different hierarchical structural levels of the universe, in which the basic substance(s) of one level are used within that level to construct what are then the basic substances of the next larger level, etc. At the lowest structural level of which man has currently gained an understanding, matter particles are the level's basic substances and are structured together with field structures to form all of the atoms. At the atomic level, the substances of the atoms are structured together with field structures to form the molecules, and at the molecular level the substances of the molecules are structured together to form the substances of the large-scale objects that we generally work with at our hierarchical level, etc. As you progress up the hierarchical chain, the number of different structures produced within each level increases. The lower the level, therefore, the simpler or more fundamental it is compared with the levels above it. Any truly fundamental theory would, therefore, need to be able to explain the complete construction of all of the structures at all of the levels in the total structure, and the complete progression of substance production from the first level through the last level. One reason that currently accepted theories such as QFT and GR, etc. can't be truly fundamental is that they do not address the substance, or the structuring of that substance, that produces the basic entities of matter particles, energy photons, and fields, which make up the currently known lowest hierarchical level of the universe. This is very odd, since there is now (and has been for some time) adequate information, both observational and within the mathematical constructions of current theories, to allow these things to be extrapolated and understood. It appears that this area has been purposely avoided. I can see reasons why that may be the case, but it is holding back man's progress, because hidden within the internal structure of matter particles is the key that can free man from the limited-scale problems that you mention. When these things are understood, some of the parts of both QFT and GR that man currently considers important will be seen to be in error. This will simplify the remaining parts and allow a complete workable theory to be developed. The real question is: how long will man hold back this development for what amount to petty reasons?

    Sincerely,

    Paul

      Awaiting a response from the author, Karen, to my comments and Paul's!


      Dear Karen,

      Good to see that your Aussie connections have done no harm: for your excellent essay hits the spot as I work on "wholistic mechanics" (WM), a classical/deterministic reformulation of physics in spacetime. WM = {CM, SR, QM, GR, QFT, QG, EFT, ...|TLR}, my essay being an introduction.

      Identifying your nine conditions as KC-1 to KC-9, it was the last -- no weirdness -- that got me started; i.e., I studied EPRB, the experiment analysed in the famous Bell (1964) paper, not accepting that the assumptions behind Bell's theorem (BT) were valid in that setting, and rejecting nonlocality.

      My starting premiss (my classical boundary condition) is true local realism (TLR): the union of true locality (no influence propagates superluminally, after Einstein) and true realism (some existents may change interactively, after Bohr).

      Revising EPR's naive definition of "elements of physical reality", I find determinism in play, refute Bell's theorem, and (from first principles, in spacetime) find the Laws of Malus, Bayes and Born validated. Born's law (an effective field theory, in my terms; in the space of probability amplitudes) can then be tested by confirming the correct result for the EPRB expectation; then the correct DSE results; then onward to the stars.

      In thus eliminating "wavefunction collapse" and nonlocality from QM, it follows that such weirdness need no longer trouble the foundations of QFT; etc. And since my calculations are conducted in spacetime (not Hilbert space), I'm thinking QG is covered automatically.

      Enough: such is my long way of saying that I will welcome your comments at any time.

      With thanks for your stimulating essay, and with best regards from down-under,

      Gordon Watson (determined and free-willed)

        Apologies: it appears reCAPTCHA logged me out when it malfunctioned! GW

        Dear Karen,

        Yes, for the reason that physics has taken to predicting pointer positions from which it conjures up sociologically desirable 'worlds'. Your essay reminds me of the trend in many sciences to deal with secondaries like methodology and the definition of what would count as progress and what it is supposed to look like.

        Since Feyerabend we know that Everything Goes (provided it goes), because otherwise we logically overdetermine the problem and thereby prevent any solution. In other words, physics got stuck in the analytical abracadabra of pseudo-empiricism.

        Heinrich