• [deleted]

"...If it is because mathematical logic or experiment confirms that they are right, then that's fine. If it is just because we have grown used to the formalism then we need to question if it is the right way to go forward..."

True, Philip, but the trouble is that we have become so clever at mathematics that it is possible to frame experimental results in many different ways that seem to point in contrary directions. That is when the second part of your sentence kicks in: the formalism we have been educated in blinds us to other views that may be equally valid but not as 'fashionable'. I am far from being an expert, but I wonder whether theories get shot down for contradicting some minor tenet of a rival theory, and not for contradicting their own premises or experimental results?

Sorry this is off-subject, but it is unfortunate that the name of the black-hole related "holographic principle" has somewhat upstaged another possible holographic principle. Holograms have a unique property: if you cut out a small portion of the film, it still encodes the larger picture, only at lower resolution. As in fractals, and rather like how nature seems to work - the universe in a grain of sand sort of thing!

    • [deleted]

    Ervin, I don't think it "ought" to come out that way, but I think it might. Do you think it couldn't?

    The AdS/CFT correspondence only works for supersymmetric theories. For non-supersymmetric QCD it may be an approximation, but I doubt it would lead to anything other than qualitative results.

    Verifying the holographic principle by direct observation may never be possible, just as we may never be able to detect Hawking radiation from a black hole. These things might be forever theories which are necessary to accept for consistency. I am not saying this has to be the case. There are other ways things could pan out.

    • [deleted]

    "I am far from being an expert but wonder if theories get shot down for contradicting some minor tenet of a rival theory, and not for contradicting its own premises or experimental results?"

    Yes, this happens. New ideas are sometimes hard to accept because they contradict an old dogma. My series of blog posts about "crackpots" who were right provided a number of historical examples. However, these cases are rare; usually the expert consensus is right, but not always.

    • [deleted]

    The general thinking is that above the symmetry restoration energy, where the Higgs condensate produces particles, physics should exhibit conformal scaling up to the string scale. So in a nutshell, the physics we observe in the 10 TeV range should be generically similar to physics all the way up to the string scale. Further, since the amplitudes scale logarithmically with energy, |A|^2 = probability ~ (g^2/4π) log(Λ/E), with Λ a cutoff set by the string tension, the departures at our modest LHC energy from physics at much higher energies are not terribly significant.
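
    A rough numerical sketch of that logarithmic insensitivity, with a made-up coupling and cutoff (the values below are purely illustrative, not numbers from this thread):

```python
import math

# Toy illustration of the logarithmic scaling quoted above: if the probability
# goes like (g^2 / 4 pi) * log(Lambda / E), the change between LHC energies and
# far higher energies is modest.  Coupling and cutoff values here are made up.
g2_over_4pi = 0.1                 # assumed coupling strength, purely illustrative
Lambda = 1e16                     # assumed cutoff in TeV, of order a string/GUT scale
for E in (10.0, 1.0e3, 1.0e10):   # TeV: LHC range, cosmic-ray range, far beyond
    print(f"E = {E:9.1e} TeV  ->  (g^2/4pi) log(Lambda/E) = "
          f"{g2_over_4pi * math.log(Lambda / E):.3f}")
```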

    Of course nature could pull little tricks on us, where things radically change at some energy between E_s and E_{LHC}. It is my hope that further into this century we become good enough at cosmic ray physics to look at least approximately at 10^3 TeV particle physics. Yet nature could still have some unobservable change at the 10^{10} TeV scale or so; in that case we are fooling ourselves. Even then, there might be some signatures of this at the energy scales we can probe.

    In this way we may get indirect measurements of physics involving the holographic principle and Hawking radiation. The recently observed fluid-like properties of quark-gluon plasmas are being investigated as a "soft black hole" analogue of Hawking radiation physics. They could also provide signatures for how gluon chains are equivalent to the graviton.

    The one sad fact of life is that our power to measure and observe nature at these most extreme scales is rather indirect or "oblique." Lawrence Krauss mentioned something similar to this last year, or back in 2008, with the lament that this could signal the ultimate end of the foundations of physics or cosmology. Yet there is plenty of wiggle room to work within for now. Besides, science is not so much about coming to know everything as about knowing something relatively well.

    Cheers LC

    In 2006, I predicted the perfect fluid based on my C-field theory. It is beyond me why most physicists would rather speculate on unseen entities based on unreal mathematics instead of examining real fields with real Yang-Mills non-linearities that account for most of the real mysteries of current physics, while not requiring any new particles. I suspect it has to do with both education/brainwashing and with the remark made below that "[one] wonders if theories get shot down for contradicting some minor tenet of a rival theory, and not for contradicting its own premises or experimental results?"

    I am also puzzled that so many focus on speculative physics while literally ignoring known facts that are clearly telling us real non-speculative physics. There is clearly as much psychology as physics at play here.

    As for science being "not so much about coming to know everything, but knowing about something relatively well" that may be a technologist's perspective but it is certainly not that of all physicists.

    Edwin Eugene Klingman

    • [deleted]

    Phil "...the expert consensus is right, but not always"

    A consensus can be very wrong - vide a whole gamut of ideas, from phlogiston to you name it. And what about now - is there an expert consensus? From all that I gather, physics is like the Tower of Babel, with people talking in different languages. Nature cannot be that multilingual; I think it acts in a simple, direct way - like your qubits, for example.

    • [deleted]

    The geometry and the algebras are purely linked in the sphere.

    Steve

    • [deleted]

    Excellent paper, a work of science and not science fiction. It is nice to read a paper written by someone with a full grasp of current theory that is intended for a general audience and is sufficiently technical to justify its statements, but not so technical as to lose its dramatic story.

    My question is whether you think one could see the appearance of effects analogous to gravity for quantum computers with 4N qubits?

      • [deleted]

      It's certainly an interesting question. I agree that if P=NP the implications for computational algorithms are huge. I am not so sure that it would affect physics in a profound way. The P=NP question is about algorithms which search a large space of possible solutions, so that a solution may take a long time to find, but once found it can be checked quickly. If the universe operates like a computer or quantum computer, it is unlikely to be running an algorithm of that type. More likely it will just be constructing solutions to some kind of evolution equation. This could take a lot of computation, but the only way to check the answer is to rerun the whole thing in full. P=NP does not tell you anything about that kind of algorithm.
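
      As a toy illustration of the find-versus-check asymmetry described here (my own example, not from the essay), consider subset sum:

```python
from itertools import combinations

# Toy illustration of the find-versus-check asymmetry, using subset sum
# (a standard NP-complete problem).  The numbers are made up.
numbers = [3, 34, 4, 12, 5, 2]
target = 9

# Finding a solution: in the worst case every subset is tried (exponential time).
solution = next((combo for r in range(1, len(numbers) + 1)
                 for combo in combinations(numbers, r)
                 if sum(combo) == target), None)

# Checking a proposed solution: a single sum, done in linear time.
print(solution, solution is not None and sum(solution) == target)
```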

      In any case, most people believe that P != NP. It is just too much to hope that a discovery could be made that would make us "almost like gods". It is a question that will be very hard to settle because it says so much about so many different things.

      • [deleted]

      Dear Sir,

      Let us make a one US dollar bet. The LHC will not, repeat not, find the Higgs boson, even after 100 years and a hundred upgrades.

      Regards,

      basudeba.

      • [deleted]

      Dear Sir,

      Your post does not address the point raised by us. How do you define "sound mathematics"? What we meant to say is that the "mathematics" of physicists is unmathematical. It is more a manipulation of numbers to suit one's convenience. A proof of this is the plethora of theories and interpretations that go by the name of Quantum Theory. Most of what is called "mathematics" in modern science fails the test of logical consistency, which is a cornerstone for judging the truth content of a mathematical statement. For example, you do not define infinity - it is not a big number. All mathematical operations involving infinity are void. Thus, renormalization is not mathematical.

      The Schrödinger equation was devised to find the probability of finding the particle in the narrow region between x and x+dx, which is denoted by P(x) dx. The function P(x) is the probability distribution function or probability density, which is found from the wave function ψ(x) by the relation P(x) = [ψ(x)]². The wave function is determined by solving Schrödinger's differential equation: d²ψ/dx² + (8π²m/h²)[E−V(x)]ψ = 0, where E is the total energy of the system and V(x) is the potential energy of the system. By using a suitable energy operator term, the equation is written as Hψ = Eψ. The equation is also written as iħ ∂/∂t|ψ› = H|ψ›, where the left hand side represents iħ times the rate of change with time of a state vector. The right hand side equates this with the effect of an operator, the Hamiltonian, which is the observable corresponding to the energy of the system under consideration. The symbol ψ indicates that it is a generalization of Schrödinger's wave-function. The way the equation has been written, it appears to be an equation in one dimension, but in reality it is a second order equation signifying a two dimensional field, as the original equation and the energy operator contain a term x². The method of generalizing the said Schrödinger equation to the three spatial dimensions does not stand mathematical scrutiny. A third order equation implies volume. Addition of three areas does not generate volume: x+y+z ≠ (x·y·z) and x²+y²+z² ≠ (x·y·z). Thus, it is no wonder that it has failed to explain spectra other than hydrogen. The so-called success in the case of helium and lithium spectra gives results widely divergent from observation.
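
      For readers who want to see the textbook equation quoted above in action, here is a minimal finite-difference sketch (a standard construction under simplifying assumptions of my own: units with 8π²m/h² = 1 and V = 0 inside a box, not anything proposed by the poster):

```python
import numpy as np

# Minimal finite-difference sketch of the textbook equation quoted above,
# d^2 psi/dx^2 + (8 pi^2 m / h^2)[E - V(x)] psi = 0, taking units in which
# 8 pi^2 m / h^2 = 1 and V = 0 inside a box of width L (my own simplification).
N, L = 500, 1.0
x = np.linspace(0.0, L, N)
dx = x[1] - x[0]

# Discretize -d^2/dx^2 on the interior points; its eigenvalues approximate E.
main = np.full(N - 2, 2.0) / dx**2
off = np.full(N - 3, -1.0) / dx**2
H = np.diag(main) + np.diag(off, 1) + np.diag(off, -1)

E, psi = np.linalg.eigh(H)
print(E[0], (np.pi / L) ** 2)   # lowest level vs the analytic value pi^2 / L^2
# P(x) is then the (normalized) squared magnitude of the eigenvector psi.
```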

      The probability calculations needed to work out the chance of finding an electron (say) in a particular place at a particular time actually depend on calculating the square of the complex number corresponding to that particular state of the electron. But calculating the square of a complex variable does not simply mean multiplying it by itself, since it is not the same as a real number. Instead, another variable, a mirror-image version called the complex conjugate, is considered, obtained by changing the sign in front of the imaginary part (if it was + it becomes -, and vice versa). The two complex numbers are then multiplied together to give the probability. This shows that, truly, it is not squaring but a mathematical manipulation, as the negative sign implies physical non-existence of the second term, like the physical non-existence of a mirror image. If A has 5 apples and gives them to B, then only B has those five apples, and A is said to have -5 apples to signify that his five apples are physically with B. Similarly, the mirror image does not make two objects, but only one real object and one physically non-existent image. This is not mathematics, as mathematics deals with numbers, which are a characteristic of physical objects. Similarly, mathematically all operations involving infinity are void. Hence renormalization is not mathematical. The brute force approach, where several parameters are arbitrarily reduced to zero or unity, is again not mathematical, as the special conditions that govern the equality sign for balancing cause and effect are ignored. The arbitrary changes change the characteristics of the system. If we treat the length of all fingers as unity, then we cannot hold an object properly. There are innumerable instances of un-mathematical manipulation in the name of mathematics.
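
      For concreteness, the conjugate-multiplication prescription under discussion in a few lines of code (the amplitude is an arbitrary toy value of mine):

```python
# The "squaring" prescription under discussion: for a complex amplitude psi the
# probability is conj(psi) * psi, which is real and non-negative, whereas
# psi * psi generally is not.  The amplitude below is arbitrary.
psi = 0.6 + 0.8j
print(psi * psi)                       # (-0.28+0.96j): complex, not a probability
print((psi.conjugate() * psi).real)    # 1.0: the probability |psi|^2
```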

      The requirement that all fundamental theories be presented within a concise mathematical framework has virtually prevented serious theoreticians from ever considering a non-field theory, because of its mathematical complexities. The "mathematics" involved in field theories to describe events is simple and concise when compared with the "mathematics" of the same events in non-field terminology. Non-field theories are denied serious consideration because they cannot be given a precise mathematical description. Even if someone were able to develop a precise set of non-field equations, they would likely be so complex, mystifying and un-mathematical that only a few mathematicians would be able to understand them.

      Kindly clarify the points raised above.

      Regards, basudeba

      • [deleted]

      You are not alone in being skeptical about the Higgs boson; some high-profile physicists are too. Stephen Hawking has predicted its non-existence, and Veltman, whose Nobel prize was given for work related to the Higgs boson, is said not to believe in it.

      If the Higgs boson is not there then the LHC will still tell us that, because there are limits to how high its mass can be. Not finding the Higgs boson therefore still counts as finding "something new". It would tell us that the Standard Model is wrong in a way we least expected. There are a few theories for how that might happen, but none of them are very good.

      I don't think it is a very likely outcome but I am not one for making bets.

      • [deleted]

      Dear Sir,

      Please pardon us for saying so, but your post is highly optimistic while side-stepping the main issues.

      We do not subscribe to string theory in its present form. We do not believe the Higgs boson or the graviton will ever be discovered. But as we have hinted in our essay, we have developed an alternative model from fundamental principles, which gives the following testable predictions:

      1. The accepted value of the electric charge of quarks contains an error element of 3%. Instead of +⅔ and -⅓, it should be +7/11 and -4/11. Thus, taking the measured charge of the electron as the unit, the value of the electric charge of the proton is +10/11 and that of the neutron -1/11. The residual negative charge is not apparent, as negative charge always confines positive charge and flows towards the concentration of positive charge - the nucleus. Hence it is not felt outside. It is not revealed in measurement due to the nature of the calibration of the measuring instruments. This excess negative charge confines the positive charge (nearly 2000 times its magnitude), which is revealed in atomic explosions. Charge neutrality only means that the numbers of protons and electrons are equal.

      2. The value of the gravitational constant G is not the same for all systems. Just as the value of the acceleration due to gravity g varies from place to place, the value of G also varies between systems. Gravity is not a single force, but a composite force of seven that act together separately on micro and macro systems. Only this can explain the Pioneer Anomaly, which even MOND has failed to explain. Similarly, it can explain the sudden change of direction of the Voyager spacecraft after the orbit of Saturn and the fly-by anomalies.

      3. The value of the fine-structure constant α that determines the electromagnetic field strength, as calculated by us theoretically from our atomic orbital theory, is 7/960 (≈1/137) when correlated to the strong interaction (the so-called zero energy level) and 7/900 (≈1/128) when correlated to the weak interaction (the 80 GeV level). There are 5 more values that determine the structure of the orbitals in the atomic spectra. Hence the physically available values of the s orbitals (principal quantum number) are restricted to n = 7, though theoretically it can have any positive integer value.

      4. There is no such thing as Lorentz-variant inertial mass. It has never been proved.

      5. We do not subscribe to the modern view of fields. We believe in only two types of fields hinted in our essay.

      Regards,

      basudeba.

      • [deleted]

      Many Thanks Sir,

      basudeba

      • [deleted]

      Kilgore, thank you for the compliments.

      It's an interesting question. The sober answer is that a quantum computer running a program that simulates the universe would show gravitational effects in the same way that a classical computer simulating Newtonian gravity does, but is there a deeper answer?

      It is possible that some class of algorithms with high complexity running on quantum computers might show a collective behaviour whose universality class is described by string theory. That would be beyond what I suggest in this essay, but not much beyond stuff I wrote in the past about the "Theory of Theories". Perhaps that is the kind of idea you are hinting at?

      If the choice of vacuum is analogous to the program for the quantum computer of the universe, then it can be described using 200 bytes (I am assuming the current estimate of 10^500 vacua is correct). It does not sound like a very complex program, even if the information in the program is highly compressed. Of course the "data" consists of many more bits, so collective behaviour is still a possibility.
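
      A quick back-of-envelope check of that figure, taking the 10^500 estimate at face value:

```python
import math

# Back-of-envelope check of the "200 bytes" figure: picking one vacuum out of
# an assumed 10^500 requires log2(10^500) bits of information.
bits = 500 * math.log2(10)
print(round(bits), round(bits / 8))   # about 1661 bits, about 208 bytes
```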

      • [deleted]

      Part of the problem for me is that you're "going inside" the qubit and assuming (because the mathematics of ST encourages one to do so) layers of information and information processing which we simply cannot detect in physical reality.

      A guy I profoundly respect, Hans C. von Baeyer, gets rhapsodic about the qubit. He sees a peeled grape, translucent, shimmering, pregnant with mystery, possibly comprising a literal microcosm. It's all there! Then you measure the thing and, as he says, "All you get is one lousy bit. It's such a waste."

      Of course it is. But he's of the IQOQI school and realizes that what you get is what you get.

      • [deleted]

      It is likely that the landscape problem is NP. The possible configurations on the landscape determine different actions, and the Euclideanized path integral or partition function e^S is similar to the problem of ordering the set of possible microstates uniquely instead of coarse-graining them. The attempted proof of P != NP by Deolalikar uses thermodynamic or statistical mechanics arguments. I am not aware of the current status of that proof, though I think people did find problems with it.

      The quantum computer does not make P = NP, but rather quantum computers solve bounded polynomial problems in PSPACE that most likely do not intersect the NP set of problems. I have thought that generalizations such as the Tsirelson bound and the PR box might solve NP-complete problems. However, this is a problematic structure --- though it might play a role in quantum gravity. Quantum mechanics has the Fubini-Study metric for the fibration of the Hilbert space over the projective Hilbert space. This results in the Berry phase and the uncertainty principle, which give rise to the nonlocal properties of QM.

      With respect to the landscape problem, which might be NP, the grand quantum computer is probably some pure state with a very small summation of eigenmodes. After all, string theory has far fewer degrees of freedom than LQG, and strings as quantum bits mean the actual computation space is very small. So while we may observe from some local perspective that these problems seem immense, in fact the total number of degrees of freedom is actually very small and only appears large because we are observing nature partially, in some entanglement. By T-duality the number of modes corresponds to winding numbers on compactified spaces, such as Calabi-Yau manifolds. However, the singular points or cusps on those spaces may just be entangled states within a quantum computation which transforms between these spaces as conifold maps --- quantum conifolds!

      There is a hierarchy of problems, which leads all the way up to the undecidability of Turing's halting problem and Gödel's proofs. The matter of P != NP is lower on that hierarchy. "Being as gods" would to my mind require a time when there is some unification of mathematics which removes the barriers to Hilbert's 23 problems presented by Gödel. I doubt that will happen.

      Cheers LC

      • [deleted]

      "The quantum computer does not make P = NP, but rather quantum computers solve bounded polynomial problems in PSPACE that most likely do not intersect the NP set of problems."

      Those would be resource-intensive problems that classical computers could, in theory, also solve, just not efficiently. So I believe you're right. The consensus is that a quantum computer is (or would be) a seriously souped-up classical computer. The fact that NP is in PSPACE means (I think) that individual NP problems are "solvable" by brute-force iteration, oracular relativization, magic coins - processes that can yield recognizable solutions but not specific compressed information, in the sense of algorithms that you can apply to the next problems you confront, much less use to reverse-engineer the universe. The fact that P is in NP is trivial, one is told, but surely nothing is more interesting or important or nontrivial than whether or not NP might be P. If that's the case, then all information is (or ought to be) compressible. We WOULD be almost like gods. (Aaronson's point there is that he doesn't believe it'll happen ... that's the NP-hardness assumption. Which frankly seems only reasonable.)
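
      A small sketch of the "brute force in polynomial space" point, using a toy SAT instance of my own (the formula and encoding are made up purely for illustration):

```python
from itertools import product

# Deciding a toy SAT instance by iterating over all assignments takes
# exponential time but only ever stores one candidate assignment at a time,
# which is why NP sits inside PSPACE.
def brute_force_sat(num_vars, clauses):
    # clauses: each clause is a list of signed 1-based variable indices,
    # positive for the variable, negative for its negation.
    for assignment in product([False, True], repeat=num_vars):
        if all(any(assignment[abs(lit) - 1] == (lit > 0) for lit in clause)
               for clause in clauses):
            return True
    return False

# (x1 or not x2) and (x2 or x3)
print(brute_force_sat(3, [[1, -2], [2, 3]]))   # True
```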

      Of course the P/NP problem has a precise mathematical formulation and could only be resolved mathematically. Some math that would be. But that doesn't mean it carries no ontological implications. Gödel inferentially touched on that in his letter to von Neumann in the 1950's (unearthed in the 90's after the problem had already been stated). It's worth reading, and Gödel explicitly notes the Entscheidungsproblem:

      http://rjlipton.wordpress.com/the-gdel-letter/

      • [deleted]

      Most readers could probably spot the space, but:

      http://rjlipton.wordpress.com/the-gdel-letter/

      • [deleted]

      I must confess that I wasn't thinking about a "Theory of Theories". My only thought at the time of the question was whether we would expect a quantum computer to produce results that include the effects of gravity as an intrinsic output. For instance, if we had sufficient precision, would a simulation of H2 using a quantum computer have gravity effects included in the output even if the specific quantum algorithm was not designed to include those effects? It would seem that if we are to link spacetime to entanglement then we cannot remove the effects of QG without an appropriate correction (not sure if that is similar to the correction codes you are referring to).

      The Theory of Theories idea is interesting, and I would offer that one unifying concept in a theory of theories is that of ordering. Any non-constant variable taking any set of values that can correspond to numbers can be placed in some order. A cumulative sum of the values of the variables will always have some curvature (possibly none).
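
      A toy version of that ordering idea, as I read it (the values below are arbitrary and only meant to illustrate the sort/cumulate/curvature steps):

```python
import numpy as np

# Sort a set of numeric values, take the cumulative sum, and inspect its
# discrete second difference ("curvature").  The curvature vanishes only
# where consecutive sorted values are equal.
values = np.array([3.0, -1.0, 4.0, 1.5, 0.0, 2.0])
ordered = np.sort(values)
cumulative = np.cumsum(ordered)
curvature = np.diff(cumulative, n=2)
print(ordered)
print(cumulative)
print(curvature)
```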

      I think the idea of understanding vacua as programs is interesting. The notion that there is a code for the vacua is also interesting. I have to admit I didn't think about it along those lines until reading your article. I agree that a 200-byte program is not particularly complex, and certainly the set of meaningful programs can only be addressed by understanding the language, or semantic, problem associated with communication theory. To that end I can only offer the suggestion that it's a question of the effectiveness of the information in the program. In that sense, we should think that there is some language that maximizes the effect of the program in question, and it would seem that if we knew that language, we could better understand what choices of programs are possible. In some sense we may need to look at approaches that maximize redundancy. I am not sure how far that treads into anthropic notions, where the observer in effect is somehow choosing the language and program that make the observer possible, but again, we have to remember that the universe is ultimately observing itself, so it isn't really a question of human perception.

      Just my thoughts.