Hi Philip,

Your hypothesis is that quantum information is fundamental and that all material entities, including spacetime, are emergent. But do we need something more fundamental than spacetime itself? In my opinion (presented in my essay http://fqxi.org/community/forum/topic/1609) we do not. I propose a simple experiment to prove that particles, fields and information are the same thing, i.e. they are all spacetime deformations, merely perceived and defined by human beings in different ways.

My prediction is not very exotic or new; it is essentially a consequence of general relativity (at least of its great discovery that gravity is not a force but only spacetime geometry). I ask: why not apply the same rule to the other "forces" and look for them in spacetime geometry? Using your words, ...that would be the amazing power of consistency. That simple question calls for a complicated answer. The answer is a universal metric. As we do not have one, I do not want to wait; instead I propose to start from a prediction of the idea itself and an experiment that could falsify or confirm that prediction. In the latter case the time will come to look for the metric, at least being sure of its existence.

We are human beings, so we depend on our perceptual abilities. Everything we perceive is a creation of our minds. We call it Reality. If we want to be sure that Reality exists, and not only in our minds, we desperately need a real experiment. Creating highly sophisticated ideas like the holographic principle does not change that fact. Do the holographic idea or string theories give any predictions?

We need the outcome of the experiment, and maybe then we could propose an even stronger equivalence principle, claiming that every interaction is entirely geometrical in nature.

If you are looking for energy conservation you will find it by assuming that spacetime is the fabric of everything, i.e. of particles and force fields. A spacetime deformation needs energy and is the energy itself. That is not my idea; that is GR. Then any randomly chosen region of spacetime gives you the perfect conservation you need. Every entity, e.g. a particle, is spread out over the entire spacetime (or Universe, if you like) according to a Gaussian distribution, a continuous probability distribution with the apparently strongest deformation (spacetime density?) at the centre of the observed entity.

Thanks

    "Measure the strength of the gravitational field and you have info on how much mass is in the hole. Is this relevant to the definition of "information"?"

    Yes, this is a point I am making in my essay. The gravitational field around a black hole can tell you its mass, momentum, angular momentum and position. That's ten numbers. The electric field can tell you its charge. Other gauge fields would give more information. If there were a huge hidden gauge symmetry, it might be that all information could be accounted for in this way. That could be how holography works.

    I think you are right that Hawking radiation is not completely random. If the BH is charged it will at some point radiate away that charge. However, the radiation should be thermal, with the energy distribution of a black body. To resolve the information loss problem we may need to accept that it is non-random in other, more subtle ways.
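    As a rough numerical check on the black-body claim, the standard Hawking temperature formula T = ħc³/(8πGMk_B) can be evaluated directly (a sketch; the constants are standard CODATA values and the solar-mass example is only illustrative, not taken from the thread):

```python
import math

# Physical constants (SI units)
hbar = 1.054571817e-34   # reduced Planck constant, J*s
c    = 2.99792458e8      # speed of light, m/s
G    = 6.67430e-11       # Newton's constant, m^3 kg^-1 s^-2
k_B  = 1.380649e-23      # Boltzmann constant, J/K
M_sun = 1.989e30         # solar mass, kg

def hawking_temperature(mass_kg):
    """Temperature of the thermal (black-body) Hawking spectrum,
    T = hbar * c^3 / (8 * pi * G * M * k_B)."""
    return hbar * c**3 / (8 * math.pi * G * mass_kg * k_B)

print(hawking_temperature(M_sun))  # ~6.2e-8 K: tiny, and inversely proportional to mass
```

    Note that T scales as 1/M, the 1/mass dependence discussed further down the thread.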

    • [deleted]

    Just a few more comments on the black hole Hawking radiation information claims. (Apologies if this wastes space, please delete if it seems off topic.)

    The idea that heavy positive ions may be more likely to fall into a black hole is conventional fractionation of ionized gas in a gravitational field. If a large cloud of gas is falling into a black hole, it accelerates and heats up, becoming ionized. The positive ions in this plasma have less kinetic energy than the lighter electrons, so the heavy ions effectively fall faster through the gas and enter the black hole first. Obviously in a vacuum all masses fall at the same rate, but in a gas the more massive molecules move more slowly and so end up at the bottom, while the lighter ones pick up greater velocity in collisions and end up preferentially at higher altitudes, as explained by Maxwell's kinetic theory of gases. E.g., hydrogen gas rises in the atmosphere while heavier molecules concentrate at sea level.

    There are issues with Hawking's claim that all black holes radiate at a rate dependent simply on 1/mass. Schwinger's vacuum polarization calculation for the running coupling in QED shows that there is a 0.5 MeV threshold (an IR, or low-energy, cutoff) on polarizable pair production, corresponding to an electric field of ~10^18 V/m. Below that field strength there is a sharp exponential fall-off in spontaneous pair production, which would prevent Hawking's radiation mechanism from working. Hence, if Hawking's heuristic mechanism for Hawking radiation (spontaneous pair production at the event horizon) is true, then Schwinger's experimentally verified QED calculation of the magnetic moment of leptons implies that you need an electric field strength of >10^18 V/m at the event horizon of a black hole, or it won't emit Hawking radiation. So a black hole would need a massive electric charge to radiate Hawking radiation, a fact that isn't mentioned by Hawking or included in his equation! The only ways around this would be either to abandon Hawking's heuristic mechanism (easier for mathematicians than for physicists), or for quantum gravity (rather than existing QFT) to provide the energy density for spontaneous pair production in the vacuum.
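    The ~10^18 V/m figure quoted here is the Schwinger critical field, E_c = m_e²c³/(eħ), which is easy to verify numerically (a sketch with standard constants; the variable names are mine):

```python
# Schwinger critical field E_c = m_e^2 * c^3 / (e * hbar): above this field
# strength, spontaneous electron-positron pair production is no longer
# exponentially suppressed.
m_e  = 9.1093837015e-31  # electron mass, kg
c    = 2.99792458e8      # speed of light, m/s
e    = 1.602176634e-19   # elementary charge, C
hbar = 1.054571817e-34   # reduced Planck constant, J*s

E_c = m_e**2 * c**3 / (e * hbar)
print(f"E_c = {E_c:.2e} V/m")  # ~1.3e18 V/m, matching the threshold quoted above
```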

    So it seems to me that a particular quantum gravity theory is needed to validate the mechanism for Hawking radiation. The energy density of an electric field is half the vacuum permittivity times the square of the field strength, and since for protons the gravitational field is about 10^40 times weaker than the electromagnetic field, you can estimate pair production in a quantum gravity field by scaling from electromagnetism. But there is the question of whether the gravitational coupling runs with energy, as assumed in supergravity, or not. If the couplings do unify at the Planck scale, the difference between the EM and gravitational force strengths for fundamental particles will decrease from 10^40 to 1 as energy (or the inverse of distance) increases. Fundamental particles, if treated as black holes, should radiate intensely due to their small mass. Is the physical basis of gauge theory, the exchange of off-shell gauge bosons between charges, what causes the fundamental forces?
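    The energy-density formula mentioned here, u = ε₀E²/2, can likewise be evaluated at the pair-production threshold (my illustrative numbers, not the commenter's):

```python
# Energy density of an electric field: u = 0.5 * eps0 * E^2, in J/m^3
eps0 = 8.8541878128e-12  # vacuum permittivity, F/m

def field_energy_density(E_v_per_m):
    """Half the vacuum permittivity times the square of the field strength."""
    return 0.5 * eps0 * E_v_per_m**2

# At the ~1.3e18 V/m Schwinger threshold discussed above:
print(field_energy_density(1.3e18))  # ~7.5e24 J/m^3
```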

    Jacek, welcome to the contest, I will be reading the new essays later today.

    • [deleted]

    Hi Philip,

    I have one question and one request. I have been reading your papers and trying to see what exactly you are trying to do. The use of event symmetry is interesting, but I am not clear about multiple quantization and the path integral; could you clarify them, please?

    The discussion in this thread, and with Jochen, is most interesting to me. Now, I don't want to make this thread about my theory, but I would like a line or two of feedback: can you see any link to your ideas? My theory is very Platonic; it links space, energy and matter all in one concept, and time is a change of state, so it does not appear explicitly. As you can see, the Lagrangian falls out of the system for the Bohr-like model and you get the usual relation between c, h_bar and alpha. I have many other results that I have not shown, like the g-factor, the fine-structure constant and the full QM hydrogen 1S (in which, if you change the proton width - very close to the measured value - even a little, the energies come out wrong). I hope I can show all that in time for this contest.

    Philip

    No I do not object to words, per se, and I am always searching for the underlying correspondence with reality.

    So space is not emergent then, given that definition, because physical existence involves space. Neither is time emergent, assuming what it relates to is understood, because there definitely is difference in physical existence, and difference involves a rate at which it occurs. Without space nothing could exist, and without change nothing would differ, which must occur at a rate.

    Certainly we can only 'assign' space via entities, and we can only calibrate time via entities, but surely that is not the point about emergence? So I am now really lost.

    In the meantime I will re-read your essay.

    Paul

    Nige, it seems to me that you might be conflating some unrelated things. Hawking radiation has nothing to do with pair production in an electric field. It is an effect coming from the horizon. The "heuristic" explanation involving virtual pair production that Hawking and others have used in popular accounts does not really represent the detailed calculation that Hawking did. That was based on methods of semi-classical quantum gravity that are hard to describe well in general terms.

    Quantization is the procedure that takes you from a classical theory to a quantum theory by replacing classical observables with non-commuting observables in a specific way (as formulated by Dirac). An equivalent procedure is to use path integrals as defined by Feynman. Multiple quantisation is the idea that this can be repeated iteratively: you treat the quantum Schrödinger equation as if it were a classical field equation and quantise again. The dream is that if you define quantization in a very general algebraic way you can derive physics from just this idea.
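    The commuting-versus-non-commuting distinction can be illustrated in a toy way with sympy's noncommutative symbols (only a sketch of the algebraic point, not the full Dirac procedure):

```python
from sympy import symbols, expand

# Classical observables commute, so the "commutator" x*p - p*x vanishes.
x_cl, p_cl = symbols('x p')
assert expand(x_cl * p_cl - p_cl * x_cl) == 0

# Quantization replaces them with non-commuting observables, so X*P - P*X
# survives (canonical quantization then fixes its value to i*hbar).
X, P = symbols('X P', commutative=False)
print(expand(X * P - P * X))  # does not simplify to zero
```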

    I had a quick look at your theory, and of course it has a lot in common with other frameworks, including mine. I like the idea of trying to simulate systems because I started out in lattice gauge theories. I can't give you a proper review here, but I think it would be good if you could write an essay and submit it here. You may then get some feedback from other authors.

    • [deleted]

    I think the view that we live in "An Acataleptic Universe" (the ancient Skeptical view that no more than probable knowledge is available to human beings) is a mistake. In light of the quantum mechanical understanding that there is no certainty, only probability, it is an understandable mistake, but an error nevertheless. What is needed is a comprehension of quantum mechanics. Einstein did not actually say it was wrong - he agreed that it's correct as far as it goes. But he thought it was very, very incomplete and didn't go far enough. You and I may philosophically (or acataleptically) differ, but I'm sure we agree that modern science's understanding of quantum mechanics cannot be called complete. However, that doesn't mean it will never be complete.

    "Hidden variables" is an interpretation of quantum mechanics which is based on belief that the theory is incomplete (Albert Einstein is the most famous proponent of hidden variables) and it says there is an underlying reality with additional information of the quantum world. I suggest this underlying reality is binary digits generated in 5th-dimensional hyperspace.

    I think this information underlying reality can borrow a few ideas from string theory's notion of everything being ultimately composed of tiny, one-dimensional strings that vibrate as clockwise, standing, and counterclockwise currents in a four-dimensional looped superstring. We can visualize tiny, one-dimensional binary digits of 1 and 0 (base-2 mathematics) forming currents in a 2-dimensional program called a Mobius loop - or in 2 Mobius loops, clockwise currents in one loop combining with counterclockwise currents in the other to form a standing current. Combination of the 2 loops' currents requires connection of the two into a four-dimensional Klein bottle* by the infinitely long irrational and transcendental numbers. Such an infinite connection translates - via bosons being ultimately composed of 1's and 0's depicting pi, e, √2 etc., and fermions being given mass by Einstein's 1919 proposal of gravitational and electromagnetic bosons interacting in what quantum mechanics calls "wave packets" - into an infinite number of figure-8 Klein bottles composing the infinite universe (according to the known definition of "infinite"). (Each Klein bottle, whose gaps are deleted by the flexibility of binary digits to form a continuous space-time where surrounding subuniverses fit together perfectly, is a "subuniverse" - and we live in a finite, 13.7-billion-year-old subuniverse.)

    * This Klein bottle could possibly be a figure-8 Klein bottle because its similarity to a doughnut's shape suggests an idea from mathematics' "Poincare conjecture". The conjecture has implications for the universe's shape and says you cannot transform a doughnut shape into a sphere without ripping it. One interpretation follows: this can be viewed as subuniverses shaped like figure-8 Klein bottles gaining rips, called wormholes, when extended into the spherical spacetime that goes on forever (forming one infinite superuniverse, which is often called the multiverse when subuniverses - which share the same set of physics' laws - are incorrectly called parallel universes wrongly claimed to each possess different laws). Picture spacetime existing on the surface of this doughnut, which has rips in it. These rips provide shortcuts between points in space and time - and belong in a 5th-dimensional hyperspace. The boundary where subuniverses meet could be called a cosmic string (they'd be analogous to the cracks that form when water freezes into ice, i.e. cosmic strings would form as subuniverses cool from their respective Big Bangs).

    "Empty" space (according to Einstein, gravitation is the warping of this) seems to be made up of what is sometimes referred to as virtual particles by physicists since the concept of virtual particles is closely related to the idea of quantum fluctuations (a quantum fluctuation is the temporary change in the amount of energy at a point in space). The production of space by BITS (BInary digiTS) necessarily means there is a change in the amount of energy at a certain point, and the word "temporary" refers to what we know as motion or time (in a bit-universe, motion would actually be a succession of "frames"). Vacuum energy is the zero-point energy (lowest possible energy that a system may have) of all the fields (e.g. gravitational / electromagnetic / nuclear) in space, and is an underlying background energy that exists in space even when the space is devoid of matter. Binary digits might be substituted for the terms zero-point energy (since BITS are the ground state or lowest possible energy level) and vacuum energy (because BITS are the underlying background energy of empty space).

    I call hidden variables (or virtual particles) binary digits generated in a 5th-dimensional hyperspace, which makes them - as explained in the next sentence - of a non-local variety, in agreement with the limits imposed by Bell's theorem. (Bell's theorem is a mathematical proof discovered by John Bell in 1964 that says any hidden variables theory whose predictions agree with quantum mechanics must be non-local, i.e. it must allow an influence to pass between two systems or particles instantaneously, so that a cause at one place can produce an immediate effect at some distant location [not only in space, but also in time].) Comparing space-time to an infinite computer screen and the 5th dimension to its relatively small - in this case, so tiny as to be nonexistent in spacetime (at least to observation) - Central Processing Unit, the calculations in the "small" CPU would create and influence everything in infinite space and infinite time. This permits a distant event to instantly affect another (exemplified by the quantum entanglement of particles separated by light years) or permits effects to influence causes (exemplified by the retrocausality, or backward causality, promoted by Yakir Aharonov and others; see "Five Decades of Physics" by John G. Cramer, Professor of Physics, University of Washington - http://www.physics.ohio-state.edu/~lisa/CramerSymposium/talks/Cramer.pdf). This means quantum processes, in which effects and causes/distant events are not separated, wouldn't be confined to tiny subatomic scales but would also occur on the largest cosmic scales.

    My shortened entry in FQXi's current contest (the complete version is at vixra.org and researchgate.net) does refer to the Big Bang - actually to Big Bangs - but the article is not suggesting these came from nothing. It refers to a nonlinear concept of time in which inhabitants of this universe learn about the cosmos and then apply that knowledge to produce the Big Bang, which did not form the universe but only our infinitesimal local part of it (our subuniverse). Applying the knowledge gained over thousands of years to our local Big Bang requires time travel to 13.7 billion years ago. The method of doing this (via a 5th-dimensional hyperspace) was written out in detail in my original entry - but was unfortunately one of the things deleted to meet FQXi's length restrictions.

    The inverse-square law states that the force between two particles becomes infinite if the distance of separation between them goes to zero. Remembering that gravitation partly depends on the distance between the centres of objects, the distance of separation only goes to zero when those centres occupy the same space-time coordinates (not merely when the objects' sides are touching), i.e. infinity equals the total elimination of distance - the infinite cosmos could possess this absence of distance in space and time, via the electronic mechanism of binary digits.
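    The inverse-square behaviour invoked here is Newton's F = Gm₁m₂/r², whose divergence as r approaches zero is easy to see numerically (a sketch; the unit test masses are arbitrary choices of mine):

```python
# Newtonian inverse-square force F = G * m1 * m2 / r^2 grows without bound
# as the separation r between the centres goes to zero.
G = 6.67430e-11  # gravitational constant, m^3 kg^-1 s^-2

def gravitational_force(m1, m2, r):
    """Magnitude of the Newtonian attraction between point masses m1, m2 at separation r."""
    return G * m1 * m2 / r**2

m1 = m2 = 1.0  # kg, arbitrary test masses
for r in (1.0, 1e-3, 1e-6, 1e-9):
    print(r, gravitational_force(m1, m2, r))  # grows as 1/r^2
```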

    Certainly, zero separation makes no sense at all if the universe is confined to the laws of physics we're familiar with. It seems to be possible only if, as FQXi member Professor Max Tegmark states, we live in a mathematical universe - that is, if the sentence "Information has nothing to do with reality" is an incomplete description of reality.

    This known definition (of an infinite universe going on and on forever) is something observation and experiment might confirm. The definition of infinity as the "elimination of distance" would depend on the 1's and 0's of the binary digits. Without this "companion" definition, time travel (whether into the future or the past) would be impossible if it's an instant effect. With it, intergalactic space travel could become possible. And the companion definition can explain (by the elimination of distance) quantum entanglement in space, as well as time's retrocausality, in which the future affects the past.

    Pardon me for getting carried away and writing too much, Phil. But I'm quite a perfectionist - if I write anything at all, I end up writing a lot because I have this need to explain every detail thoroughly. Right this minute, I'm worrying that I might have forgotten something :(

      Rodney, I was reading your excellent essay while you wrote this long comment.

      I do think that quantum theory requires some revision to deal with emergent spacetime and multiple quantisation, but I don't see any reason to expect indeterminism to disappear. I suppose someone has to try out the possibility though.

      Philip,

      Although I thoroughly appreciate your expansive answer, I don't believe you directly addressed my critique! Perhaps the information in the question was garbled, so allow me to clarify. In the physical sciences, ontology refers to what actually exists, whereas epistemology refers to our knowledge of what actually exists. My question uses ontology in the physical-sciences sense, not in the theoretical/computer-science sense. The ontology of physical science is the foundation on which your "house" is built, and this, of course, can include the Platonic realm (although proving it is a rather thorny problem).

      The definitions of George Ellis concern real-world existence and are neither controversial nor contrary to your "scaffolding"; they're really quite simple, elegant, and yet powerful. Clearly they demonstrate that the Hilbert Space has an ontological (real-world) referent; what is that referent? And I have a problem with the Hilbert Space, or its real-world referent anyway, being emergent. It seems to me that qubits emerge from and live on the Hilbert Space; at the very least the two would seem to be co-dependently arising! It would seem clear that the real-world referent of the Hilbert Space is the entity which "stores" the information necessary to enable quantum entanglement; elementary particles, photons, etc., would seem to lack the capability of remembering who they had lunch with yesterday, last week, or 14 billion years ago. And the experiments of Aspect et al. (not to mention Herzog et al.) would seem to render impotent any attempt to dismiss quantum entanglement with colored-balls-in-the-urn arguments.

      In my FQXi entry (http://www.fqxi.org/community/forum/topic/1608) I reference a paper by William Tiller and Walter Dibble in which they demonstrate spatial "information entanglement" upwards of 6000 miles. In [TD] they demonstrate temporal "information entanglement" as well (see the references therein for a plethora of empirical evidence supporting such spatiotemporal entanglement). Messrs. Tiller and Dibble explain these empirical results by expanding our scientific reference frame to a duplex reference frame, such that the electromagnetic physical realm, probed by science these last 400 years, has a conjugate magnetoelectric information domain (call it mental). These domains can be coupled by what they call a deltron moiety, and conjugate properties are related by deltron-modulated Fourier transforms. The deltron moiety is a symmetry-breaking mechanism, with the amount of hidden symmetry a function of deltron density.

      And this is the heart of my critique. I'm very much sympathetic to the Platonist view and remain quite impressed with your Theory of Theories; I've pretty much read all of your papers. But how can you say "there is no point in asking where information comes from or where it is stored"? The scientific enterprise is built on such inquiries! And for certain there is no reason to censor your Platonic perspective; after all, you're in very good company and the question remains open [AHT].

      References

      [AHT] Alford, M., et al., Three theoretical physicists discuss ontology in: On Math, Matter, and Mind ( http://www.ids.ias.edu/~piet/publ/other/mmm.pdf), Foundations of Physics, Springer Science Media, 2006, accessed 30 April, 2013.

      [TD] Tiller, W. and Dibble, W., A mechanism for quantum entanglement in: What Is Information Entanglement and Why Is It So Important to Psychoenergetic Science? ( http://www.tillerfoundation.com/White%20Paper%20VIII.pdf), the Tiller Foundation, 2009, accessed 30 April, 2013.

      With regards,

      Wes Hansen

      When taken out of context, my statement that "there is no point in asking where information comes from or where it is stored" is too strong. My point was that if you are looking for a physical theory without the philosophical side then these questions are not necessary, in my opinion.

      The Hilbert space, qubits and entanglement are of course part of standard quantum mechanics. They are intrinsically non-local so they extend over the entire system under consideration, or entire universe if you like. I personally don't think they need some extra physical entity to explain how they work, but probably you are not alone in thinking otherwise.

      Phil,

      Thank you for your response. As I understand it, you're like the Copenhagenists, satisfied with an epistemological theory. Of course quantum mechanics is the most successful theory of all time, so epistemologically it cannot be messed with, but it seems to generate more ontological questions than it answers. I have a hard time seeing how the Hilbert Space doesn't have a real-world referent but, like you say, that's just me. I wish I had the mathematical sophistication that you do so I could engage in formidable debate. I'm working on it, but it's a slow process - I'm no Kit Armstrong!

      With regards,

      Wes Hansen

      Wes

      "but it seems to generate more ontological questions than it answers"

      That is because it is ontologically incorrect, i.e. it presumes physical existence has uncertainty/indefiniteness/duplicity/whatever. This creates a self-contradiction, as physical existence must occur in a sequence of discrete, definitive, physically existent states. So in order to resolve that conundrum it has to invoke all sorts of flawed concepts in order to maintain the ensuing construct (all possibilities are somehow existent, observation affects the physical circumstance, etc.).

      Its other 'defence' lies in the fact that the order it overthrew in the first place, commonly referred to as 'classical', was not understood properly. This can be characterised as 'it's changing'. But a change means that it is not the original 'it' but some other 'it'. The point is that 'its' were being deemed to exist at a higher level than how they actually occur, because existence was related to superficial physical features. In other words, physically, there is no such thing as a chair, a ball, etc.; just a physically existent state of whatever at any given time, which, despite alteration, at a higher level appears to be the same 'thing'. Indeed, even when the alterations are manifest, we still cling to the notion of it physically persisting, but rationalise it as the bush has lost its leaves/grown in size/etc. All of which is understandable for ordinary living, but physically incorrect.

      In other words, if the 'old order' ('classical') had been understood properly, it would have had the essence of the 'new order' (QM) anyway. The only difference is that there would be no presumption that physical existence occurs in a strange way - which could be characterised as physical anarchy and is an impossible basis for physical existence.

      Paul

      An excellent essay. Very interesting treatment of the holographic principle.

        Philip, I enjoyed your essay and your reasoning is correct when viewed from the accepted paradigm.

        Your statement "Sometimes the most brilliant step towards a great discovery is asking the right question to begin with" is so true that it should have been phrased even more strongly: "As a rule, the most brilliant..."

        If asking the right questions is the way forward in our development, then why are uncomfortable questions simply ignored or classed as irrelevant?

          Anton, thanks for your comments.

          I would ask the same question as you. I have been writing my ideas about physics for twenty years with very little attention paid to what I say. Of course it is easier to get heard if you are someone like Hawking, who is known for earlier work and who holds a good position. The rest of us have to do what we can to tip the odds in our favour by asking our questions in the right language and in the right places, such as these contests. When nobody listens we must work harder to seek our own answers and make sure they are recorded in a permanent place. When someone else with more authority asks the same question, you need to be ready to point to where you asked it first.

          I will read your essay next.

          • [deleted]

          I've been doing a lot of thinking, Phil. It looks like you were correct when you wrote the following to me - "I also still hold to Einstein's view that a mathematical theory is needed at the heart of physics". I've written a little thing called "Equation Describing the Universe" which makes me also think a mathematical theory is needed at the heart of physics -

          (the equation won't come out right here, but you can see the proper form at http://viXra.org/abs/1305.0030)

          Title -

          Equation Describing the Universe

          Author -

          Rodney Bartlett

          Abstract -

          Originally, I planned to call this article Hu = BEce∞, or 1 = 1. But my computer won't let me save that name - so I've changed the title to "Equation Describing the Universe". This equation looks like the one physicists are hoping will be printed on T-shirts in the middle of this century as a description of the Universe. Normally, I'd leave the development of this equation in the capable hands of Isaac Newton or Albert Einstein. They aren't here right now ... and it'll be quite a while before they return. However, they instructed me to send you this message on their behalf.

          H is for the Hamiltonian, representing the total energy of a quantum mechanical system. The subscript u stands for "universe", and Hu means the universe operates quantum mechanically (quantum effects operate macroscopically as well as microscopically, and this unification is symbolized by the first 1). BEc is for Bose-Einstein condensate, a finite form of matter that is the first known example of quantum effects becoming apparent on a macroscopic scale (represented by the second 1). Borrowing a couple of lines from the more complete explanation in the Content - "The infinite cosmos could possess this absence of distance in space and time, via the electronic mechanism of binary digits. To distinguish this definition from "the universe going on and on forever", we can call it "electronic infinity, or e∞" (not E8)." When the macroscopic quantum effects of the BEc are magnified by e∞, those effects are instantly translated into all space-time operating quantum mechanically. In other words, you can multiply a BEc (the second 1) an infinite number of times - but no matter how many (or how few) times you do it, you'll always end up with 1 (the macroscopic universe's time and space operating quantum mechanically). Consequent to this operation is the inevitable quantum entanglement of everything (matter, energy, forces), making all space and all time a unification.

          Content -

          "The universe IS something" ("Astronomy" magazine - March 2013, p.66) is interesting. This letter and its reply continue on from Bob Berman's article "Infinite Universe" ("Astronomy" - Nov. 2012) which says, "The evidence keeps flooding in. It now truly appears that the universe is infinite" and "Many separate areas of investigation - like baryon acoustic oscillations (sound waves propagating through the denser early universe), the way type 1a supernovae compare with redshift, the Hubble constant, studies of cosmic large-scale structure, and the flat topology of space - all point the same way." Support for the article - (after examining recent measurements by the Wilkinson Microwave Anisotropy Probe, NASA declared "We now know that the universe is flat with only a 0.4% margin of error." - http://map.gsfc.nasa.gov/universe/uni_shape.html;

          and according to "The Early Universe and the Cosmic Microwave Background: Theory and Observations" by Norma G. Sànchez, Yuri N. Parijskij [published by Springer, 31/12/2003], the shape of the Universe found to best fit observational data is the infinite flat model).

          Thinking about a finite cosmos makes my head hurt (if the cosmos is finite, what exists outside it? If there's something, that something must be part of the universe. If there's absolutely nothing, how can that be? Nothing doesn't exist.) But I can't really picture an infinite cosmos that never ends. A new definition of infinity is needed. The inverse-square law states that the force between two particles becomes infinite if the distance of separation between them goes to zero. Remembering that gravitation (associated with particles) partly depends on the distance between their centres, the distance of separation only goes to zero when those centres occupy the same space-time coordinates (not merely when the particles' or objects' sides are touching), i.e. infinity equals the total elimination of distance. The infinite cosmos could possess this absence of distance in space and time, via the electronic mechanism of binary digits. To distinguish this definition from "the universe going on and on forever", we can call it "electronic infinity, or e∞".

          1's and 0's would make up the bosons of gravity and electromagnetism, which would interact in wave packets to produce matter. All matter in the universe then has the potential to behave like a Bose-Einstein condensate (a state of matter composed of bosons cooled close to absolute zero, in which the atoms fall, or condense, into the lowest accessible quantum state, at which point quantum effects become apparent on a macroscopic scale). The bosons composing gravity and EM can all have the same properties, e.g. position, velocity, magnetism and spin (force-carrying particles, or bosons, defy Pauli's exclusion principle). The matter we know obeys Pauli's exclusion principle, so how is it different from a Bose-Einstein condensate? To exhibit Bose-Einstein condensation, the fermions (particles of matter) must "pair up" (not in the normal manner of sharing electrons) to form compound particles that are bosons. This "pairing up" may be achieved by using e-infinity to delete distance. This leads to a photon (such as one from the Sun) experiencing the whole universe - including BECs, gravitons, and other photons - in its existence.

          It's impossible to point to the 4th dimension of time, so this dimension cannot be physical. Since the union of space-time is well established in modern science, we can assume the 4th dimension is actually a measurement of the motions of particles occurring in the 3 dimensions of length, width, and height. The basic standard of time in the universe is the measurement of the motions of photons - specifically, of the speed of light. This is comparable to the 1960s adoption on Earth of the vibration rate of cesium atoms as the measurement of time. At lightspeed, time = 0 (it is stopped). Below 300,000 km/sec, acceleration or gravitation causes time dilation (the slowing of time as the speed of light is approached). If time is 0, space is also 0, because space and time coexist as space-time, whose warping (gravity) is necessarily 0 too. Spacetime/gravity form matter/mass, so the latter pair can't exist at lightspeed, and photons are massless (even when not at rest).
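          The time-dilation claim above can be illustrated numerically with the standard special-relativistic factor. Here is a minimal sketch (the function name `lorentz_gamma` is my own label, not from the comment): the proper time ticked off by a moving clock per second of coordinate time shrinks toward zero as v approaches c:

```python
import math

C = 299_792_458.0  # speed of light in m/s

def lorentz_gamma(v, c=C):
    """Standard time-dilation factor gamma = 1 / sqrt(1 - v^2/c^2)."""
    return 1.0 / math.sqrt(1.0 - (v / c) ** 2)

# Proper time elapsed on a moving clock during 1 s of coordinate time:
for frac in (0.0, 0.5, 0.9, 0.999):
    v = frac * C
    print(f"v = {frac:.3f}c -> proper time per coordinate second = {1.0 / lorentz_gamma(v):.4f} s")
```

At v = c the factor diverges, which is the sense in which the comment says "time = 0" at lightspeed; the formula itself is undefined exactly at v = c.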

          Suppose Albert Einstein was correct when he said gravitation plays a role in the constitution of elementary particles (in "Do Gravitational Fields Play An Essential Part In The Structure of the Elementary Particles?", a 1919 submission to the Prussian Academy of Sciences). And suppose he was also correct when he said gravitation is the warping of space-time. Then it is logical that 1) gravitation would play a role in the constitution of elementary particles and also in the constitution of the nuclear forces, and 2) the warping of space-time that produces gravity means space-time itself plays a role in the constitution of elementary particles and the nuclear forces. Gravity, being united with EM and the nuclear forces, is therefore the ultimate physical source of all repelling and attracting. Mass increase at increasing speeds is inevitable, because the object is encountering more spacetime and gravity (the producers of mass), which also confer mass's equivalent (energy) on cosmic rays that travel far enough through space, turning them into ultra-high-energy cosmic rays. But mass increase cannot become infinitely large, since space-time, gravity and mass don't exist at lightspeed. The object is converted into energy, which means mass and energy must be equivalent and energy must equal mass related to the speed of light (E=mc^2, in the words of Albert Einstein).
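          The mass-increase and mass-energy statements above correspond to the standard textbook relations (quoted here for reference, not derived from the comment's own framework):

```latex
m(v) = \frac{m_0}{\sqrt{1 - v^2/c^2}}, \qquad E = m(v)\,c^2, \qquad E_{\text{rest}} = m_0 c^2
```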

          Since there is zero, or no, spacetime at light speed, infinity exists in that state: all distances are totally eliminated, and a photon experiences the whole universe - as well as all time - in its existence. "Physics of the Impossible" by Michio Kaku (Penguin Books, 2008, p. 227) says, "...whenever we naively try to marry these two theories (general relativity and quantum theory), the resulting theory makes no sense: it yields a series of infinite answers that are meaningless." We see that infinite answers are to be expected, because light is important in relativity and infinity (in the sense of total elimination of distance) exists at light speed. Infinity and infinite answers are not barriers to uniting general relativity and quantum theory. When we realize that c=∞ (infinity exists at light speed), those infinite answers can yield not nonsense but real meaning.

          With all distances deleted and a photon experiencing the entire universe in its existence (including gravity and the nuclear forces, carried by the gravitons, gluons, W+, W- and Z0 particles), the cosmos has become finite (even subatomic or quantum sized). The "pairing up" of particles by e-infinity, i.e. by the electronic binary digits of 1 and 0, permits the matter we know to defy the exclusion principle and act as though it were buried at the centre of a planet. No gravity-EM interactions in wave packets occur at the planet's centre, meaning there is no mass* and, agreeing with conclusions from Isaac Newton's theories, (hypothetical) objects there weigh nothing. Also, the "pairing up" of particles by e-infinity means quantum effects become apparent on a macroscopic scale. This permits a "distant" event to instantly affect another (exemplified by the quantum entanglement of particles separated by light years), or permits effects to influence seemingly separate causes (exemplified by the retrocausality, or backward causality, promoted by Yakir Aharonov and others). This means quantum processes wouldn't be confined to tiny subatomic scales but would also occur on the largest cosmic scales.

          * According to the Lagrangian - the function L of a dynamical system, which summarizes the dynamics of that system - fermions should be massless, and the common view is that it's the Higgs field/boson coupled to them that gives them their masses. There are several explanations for the creation of mass - Einstein's gravitational/electromagnetic interaction is the one used here.

          Why do fermions obey the exclusion principle if e-infinity (binary digits) pairs them up to exhibit Bose-Einstein condensation, with quantum effects becoming apparent on a macroscopic scale? It must be because of temperature. The slightest interaction with the outside world can be enough to warm fragile BECs (they're normally very near absolute zero, or -273.15 degrees C), forming a normal gas. Remembering that our world's average temperature is almost 290 degrees above that, it's no surprise that the vibration from the heat splits the paired particles apart and causes them to obey the exclusion principle. Since this article refers to the 1's and 0's of base-2 mathematics (the binary system), a physical explanation (heat splitting particles apart) isn't enough, and a mathematical explanation (at least in a limited context) is desirable.

          Let's borrow a few ideas from string theory, in which everything is ultimately composed of tiny, one-dimensional strings that vibrate as clockwise, standing, and counterclockwise currents in a four-dimensional looped superstring. We can visualize tiny, one-dimensional binary digits of 1 and 0 (base-2 mathematics) forming currents in a Mobius loop - or in 2 Mobius loops, clockwise currents in one loop combining with counterclockwise currents in the other to form a standing current. Combining the 2 loops' currents requires connecting the two as a four-dimensional Klein bottle. This connection can be made with the infinitely long irrational and transcendental numbers. Such an infinite connection translates - via bosons being ultimately composed of 1's and 0's depicting pi, e, √2 etc., and fermions being given mass by bosons interacting in matter particles' "wave packets" - into an infinite number of Figure-8 Klein bottles.** Slight imperfections in the way the Mobius loops fit together determine the precise nature of the binary-digit currents (the producers of gravitational waves, electromagnetic waves, the nuclear strong force and the nuclear weak force) and thus the exact mass, charge, quantum spin, and adherence to Pauli's exclusion principle. Referring to a Bose-Einstein condensate, the slightest change in the binary-digit flow (Mobius loop orientation) would alter the way gravitation and electromagnetism interact, and the BEC could become a gas.

          ** Each one is a "subuniverse" composing the physically infinite and eternal space-time of the universe (our own subuniverse is 13.7 billion years old). We don't have to worry about accelerating cosmic expansion - the result of more space, forces, energy and matter being continually produced by binary digits - leaving our galaxy alone in space. As "dark energy" causes known galaxies to depart from view, more energy and matter can replace them (since the universe obeys fractal geometry, gravity is the source of repelling and attracting not only on a quantum scale but on a cosmic scale too, i.e. it accounts for dark energy - and for dark matter and Kepler's laws of planetary motion as well, but that's a long explanation best left to http://vixra.org/abs/1303.0218). The Law of Conservation says neither matter nor energy can be created or destroyed (though the quantity of each can change), so a better phrase might be "binary digits recycle spacetime" (when matter changes into energy or energy becomes matter, we commonly say matter or energy has been created). As well, other expanding subuniverses can collide with ours, and their galaxies can enter our space to keep our galaxy company.

            Rodney, thanks for the long comment. One bit I especially like is that the idea of particles arising from gravitational fields may return.

            By the way I fixed up the equation in your viXra abstract when you submitted it this morning. You can always put in the HTML yourself if you need equations to work. Here you can do it with LaTeX.