• Ultimate Reality
  • Classical Spheres, Division Algebras, and the Illusion of Quantum Non-locality:

Hi Fred,

Along this line -- I think the following post of mine was deleted from the now-closed "Disproofs" thread, and I thought I had not saved it to an offline file. I just found that I did save it, though; I am posting it again because I think it makes a critical point about seeing only what one expects to see:

"Joy,

As confounding as it is, I think we should be charitable toward Weatherall and the esteemed editors of the publication. They are only following a long-entrenched premise of quantum mechanics that, I have come to recognize, is exceedingly easy to accept and impossible to falsify:

"Before I say what the model is, let me motivate it a little. The experiment whose outcomes we are trying to reproduce involves spin measurements. Quantum mechanical spin does not have a direct classical analogue, but the spin operators satisfy the same algebraic relations as rotations in three dimensional Euclidean space. This is suggestive: one might take it to imply that the right way of thinking about spin is as some sort of rotation. Following this idea, one might reason that when an experimenter measures a particle to be spin up or spin down about a given vector, she is actually measuring the orientation of the body's rotation." Weatherall, p. 8

This justifies a later assumption: "A and B are explicitly deterministic, since given a value for the hidden variable, and given a choice of measurement vector, the observables take unique, determinate values. And they are local, in the sense that the value of the observable A is independent of Beta_a, and B is independent of Alpha_a.

"We assume an isotropic probability density over the space of hidden variables. This density is simply a map rho: Lambda --> [0, 1] that assigns equal probability to both cases."

I cited the paragraph immediately above in my earlier criticism of the Weatherall proposition. The unfalsifiable premise is the "equally likely" hypothesis. The rub is that if one imposes the equally likely hypothesis on a theory of continuous measurement functions, one cannot avoid getting what Weatherall calculates for the values of A and B (= 0): " ... on average, neither body is rotating about any direction. This result is consistent with the quantum mechanical expectation values and with experiment."

It certainly is. It is nothing more than question-begging, though. If the equally likely hypothesis holds over the space of hidden variables, then the result is unavoidable. That is not a scientific conclusion, in the sense of a correspondence between two independent variables. The variables are dependent on the assumption of a probabilistic measure space.

What has always frustrated the hell out of me is that even when your critics KNOW that your proposition is not based on an assumption of probability, they insert a probability space into their criticisms anyway. They don't consider it nonsense, because they mistake probability laws for physical laws. To use that classical example of Kepler orbits again, it's as if no orbital trajectory exists until one measures the curve between t and t'. They neglect the fact that if this were true, we would be able to measure the trajectory in a direction *different* from the one it naturally lives in. Quantum mechanics assumes no continuous map t --> t'.

If one is immersed in analysis and topology, one knows that orientability is not a product of observer orientation; rather, orientation is that which the topology chooses. Yes, of course, one can beg the question and assume " ... spin operators satisfy the same algebraic relations as rotations in three dimensional Euclidean space," but that doesn't really mean anything -- for if one simply accepts that a probabilistic measure at every t = 0 averages over T to zero (as it must, by the rules of arithmetic), one gets only what one assumes. Give me any purported proof of Bell's theorem (and I have examined a great many of them -- Herbert, Gill, Motl, Aaronson, et al.) and I can explicitly show that the proof is nonconstructive, by double negation. Why should one be bothered by that? Well, if the world were only as it appears, then a proof of that proposition by double negation only proves the null value of double negation itself, and no new knowledge emerges.

Your framework is science in the true sense expressed by Bronowski -- a "search for unity in hidden likenesses" -- the unity of quantum curvature with the curvature at the beginning and end of the universe.

Frankly, I'm happy that Weatherall's critique has been elevated to "official" status. Most of the criticisms of your proposition, notably Moldoveanu's and Vongehr's, are simply childishly naive. Weatherall manages to actually grasp the problem, and his wrong direction in solving it promises to reopen the real debate between EPR and Bell. We might once more be able to speak of scientific falsifiability with the credibility it deserves.

All best,

Tom"

Hi Tom,

They say here in Hollywood that any publicity is good publicity. So in a way it is true that the real debate is only getting started. It is really too bad, though, that the playing field for the debate is not level.

The notion that QM spin has no classical analogue is another reason Joy gets so much pushback. That notion, too, fits the quote from Maugham. A classical analogue is easy with parallelized 3-sphere topology.

Best,

Fred

Hi Fred and Tom,

Thanks for your comments. Here is a well-known story you might enjoy:

For five years, from December 1903 to September 1908, two young bicycle mechanics from Ohio repeatedly claimed to have built a heavier-than-air flying machine and to have flown it successfully. But despite scores of public demonstrations, affidavits from local dignitaries, and photographs of themselves flying, the claims of Wilbur and Orville Wright were derided and dismissed as a hoax by Scientific American, the New York Herald, the US Army and most American scientists. Experts were so convinced, on purely scientific grounds, that heavier-than-air flight was impossible that they rejected the Wright brothers' claims without troubling to examine the evidence. It was not until President Theodore Roosevelt ordered public trials at Fort Myer in 1908 that the Wrights were able to prove their claim conclusively, and the Army and the scientific press were compelled to accept that their flying machine was a reality. In one of those delightful quirks of fate that somehow haunt the history of science, only weeks before the Wrights first flew at Kitty Hawk, North Carolina, the professor of mathematics and astronomy at Johns Hopkins University, Simon Newcomb, had published an article in The Independent which showed scientifically that powered human flight was 'utterly impossible.' Powered flight, Newcomb believed, would require the discovery of some new unsuspected force in nature. Only a year earlier, Rear-Admiral George Melville, chief engineer of the US Navy, had written in the North American Review that attempting to fly was 'absurd'. It was armed with such eminent authorities as these that Scientific American and the New York Herald scoffed at the Wrights as a pair of hoaxers.

In January 1905, more than a year after the Wrights had first flown, Scientific American carried an article ridiculing the 'alleged' flights that the Wrights claimed to have made. Without a trace of irony, the magazine gave as its main reason for not believing the Wrights the fact that the American press had failed to write anything about them: "If such sensational and tremendously important experiments are being conducted in a not very remote part of the country, on a subject in which almost everybody feels the most profound interest, is it possible to believe that the enterprising American reporter, who, it is well known, comes down the chimney when the door is locked in his face -- even if he has to scale a fifteen-storey skyscraper to do so -- would not have ascertained all about them and published them broadcast long ago?"

Hi Joy and Fred,

Now that's a familiar story! LOL!

Thanks. I was going to compose something about having to wait for the mathematical understanding of already-known theorems to catch up to Joy's model -- and this is a perfect example. Bernoulli's principle of fluid flow was known for about 150 years before the Wright brothers flew, which should have suggested to informed scientists that heavier-than-air flight is far from impossible.

A hundred years of topology research should tell informed scientists today that local realism does not necessarily depend on nonlocal events.

All best,

Tom

Just to add another fact to puncture the mythology:

The SA article of 1905 claims "If such sensational and tremendously important experiments are being conducted in a not very remote part of the country, on a subject in which almost everybody feels the most profound interest, is it possible to believe that the enterprising American reporter, who, it is well known, comes down the chimney when the door is locked in his face -- even if he has to scale a fifteen-storey skyscraper to do so -- would not have ascertained all about them and published them broadcast long ago?"

I can testify from first-hand experience as a journalist that this is not only possible to believe; it is precisely the fact that reporters are pack animals catering to narrow corporate interests that *ensures* the suppression of new ideas in the popular press.

It is the major reason I abandoned my successful broadcast journalism career, and I can further testify that the field has not only not improved since 1974, it has gotten far worse.

That's why I admire some courageous freelancers here like Zeeya Merali. They fan the dying ember of what hope I have left.

All best,

Tom

To try and get a new technical discussion started ...

Suppose we contrast the Christian correlation function C_ab with the CHSH correlation function.

The former is deterministic, admitting random input; the latter is probabilistic, producing random output.

Could these functions be dual to each other? -- if so, it vindicates Joy Christian's claim that entanglement is an illusion, without obviating the computability and mathematical validity of the Bell-Aspect experimental results. Discrete output of Bell-Aspect -- on the assumption of entanglement -- would be identical to truncated output from a continuously randomized input function (the ThrowDie function, in Chantal Roth's programming terms) of Joy Christian's model.

See how the contrast makes duality possible? -- Bell-Aspect simply assumes entanglement, while the Christian framework follows Newton's prescription, "hypotheses non fingo." Entanglement is superfluous to deterministic, continuous, natural (and locally real) functions.
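
To make "deterministic, admitting random input" concrete, here is a bare skeleton of an event-by-event simulation loop -- a minimal sketch of my own, in Python, not Chantal Roth's code. The throw_die function is a stand-in named after her ThrowDie; the outcome functions A and B are trivial placeholders, since the geometric outcome functions of Joy's model are the substantive part and are not reproduced here. The point is only the structure: nature supplies the random hidden variable, the outcomes are deterministic +/-1 functions of setting and hidden variable, and the correlation is tallied afterward, with no probability measure put in by hand.

    import random

    def throw_die():
        # nature's random input: the hidden variable for one event
        # (a stand-in for the ThrowDie function in Chantal Roth's simulation)
        return random.choice((-1, +1))

    def A(a, lam):
        # placeholder deterministic outcome for Alice: strictly +/-1,
        # a function of her setting a and the hidden variable lam only
        return lam

    def B(b, lam):
        # placeholder deterministic outcome for Bob, local in the same sense
        return -lam

    def correlation(a, b, trials=100000):
        # average of products of deterministic outputs over random input
        total = 0
        for _ in range(trials):
            lam = throw_die()
            total += A(a, lam) * B(b, lam)
        return total / trials

    print(correlation(0.0, 120.0))  # ~ -1.0 for these placeholder outcomes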

What makes Bell's theorem important to computer-controlled applications, particularly security, is the assured pseudo-random integrity of the output: on the assumption of quantum entanglement, nonlocal results remain out of reach of computation by an adversary who has no knowledge of how the pseudo-random string was obtained.

Representative of such information technology is this 2010 paper by Stefano Pironio et al.:

"We quantify the Bell inequality violation through the CHSH correlation function [19]

I = SIGMA_{x,y} (-1)^{xy} [P(a = b|xy) - P(a != b|xy)]

where P(a = b|xy) is the probability that a = b given settings (x, y), and P(a != b|xy) is defined analogously. Systems that admit a local, hence deterministic [20], description satisfy I <= 2. Certain measurements performed on entangled states, however, can violate this inequality."
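
To fix ideas, here is how the quoted quantity I would be tallied from raw trial records -- a minimal sketch of my own, in Python, with the record layout (x, y, a, b) an assumption of mine rather than anything in the paper:

    from collections import defaultdict

    def chsh_I(records):
        # records: iterable of (x, y, a, b), with settings x, y in {0, 1};
        # estimates I = SIGMA_{x,y} (-1)^{xy} [P(a = b|xy) - P(a != b|xy)]
        same = defaultdict(int)
        total = defaultdict(int)
        for x, y, a, b in records:
            total[(x, y)] += 1
            if a == b:
                same[(x, y)] += 1
        I = 0.0
        for x in (0, 1):
            for y in (0, 1):
                p_same = same[(x, y)] / total[(x, y)]
                # P(a = b|xy) - P(a != b|xy) = 2*P(a = b|xy) - 1
                I += (-1) ** (x * y) * (2 * p_same - 1)
        return I  # local deterministic models satisfy I <= 2

Quantum mechanics allows the estimate to reach 2*sqrt(2) (the Tsirelson bound), which is the violation the paper quantifies.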

A measurement on an assumed entangled state, however, is an average of trials on a probabilistic space. A deterministic measure on a complete -- i.e., classically continuous -- space is an arbitrary choice of boundary conditions on an arbitrary interval. In other words, does nonlinear input to a continuous function guarantee precisely correlated linear output?

To answer the above question in the positive, we need an arithmetically continuous model computable from a discrete initial condition. I've been working on a proof, with the strategy outlined here and here.

Dual quantum correlation functions that generate identical results -- one function assuming non-observable quantum entanglement; the other, a correlation measurement continuous from the initial condition -- would tell us that entanglement is an artifact of an artificial probability space and not physically real.

What justifies the probability measure space? -- only the assumption that fundamental reality is probabilistic. This assumption is made apparent in Pironio et al.'s conclusion:

"Stepping back to the more conceptual level, note that Eq. (3) relates the random character of quantum theory to the violation of Bell inequalities. This bound can be modified for a situation where we assume only the no-signalling principle instead of the entire quantum formalism (see Figure 2 and 3 and Appedenix A.3). Such a bound lays the basis for addressing in a statistically significant way one of the most fundamental questions raised by quantum theory: whether our world is compatible with determinism (but then necessarily signalling), or inherently random (if signalling is deemed impossible)."

The no-signalling condition is a red herring; Joy's framework is completely compatible with the no-signalling condition -- it couldn't *not* be, in that it is fully relativistic. Non-relativistic quantum theory gets "inherently random" nonlocal results on the assumption that locality forbids signalling -- which assumes what it sets out to prove. Joy's framework generates manifestly local quantum correlations that still forbid signalling, a la special relativity.

Tom

    Another way to argue the point is to take Lucien Hardy's five reasonable axioms of quantum theory. Hardy uses the first four axioms to rule out classical probability. The fifth axiom (continuity) cannot live with the first axiom (probability).

    We would instead drop the first axiom and keep the other four, to rule out quantum entangled probability.

    Tom

    Like Godfrey Hardy (I don't know if there's any relation), Lucien Hardy is the kind of mathematician I trust, because he doesn't hedge his bets on observational (or on non-observed) outcomes. I argued with Gill in the past about the importance of theoretical independence from experiment; Gill claims conventional quantum theory *is* independent of experimental results -- and I think Hardy's paper shows clearly that it is *not*, when the assumption of a probability measure is excluded.

    Hardy's axioms subheaded 1) Probabilities; 2) Simplicity; 3) Subspaces; 4) Composite systems; 5) Continuity are consistent with Joy Christian's framework when axiom (1) is dropped.

    Though I agree with Hardy (6.1, p. 10) that the frequency interpretation of probability measure theory is to be preferred over Bayesian analysis (which I admittedly disdain in any form), I have to ask why the state (6.2) is assumed independent of its initial condition, as experimental preparation of the state implies (the experimenter is an element of the quantum system). A continuous function without a probability measure would not normalize the state vector; the function would give up information on the state of the system independent of state preparation.

    (Correcting a misstatement in the previous post: I meant to say that Hardy rules out classical probability by adding the fifth axiom, while the first four support both classical and quantum probability.)

    Tom

    12 days later

    Hi Tom,

    Thanks for your comments.

    Just to let you know: I have revised my latest paper slightly. It now includes a new footnote on pages 18 and 19, and a new figure on page 22 (please see the attached paper).

    It is also worth noting here that the explicit, event-by-event, Java simulation of my model written by Chantal Roth has now been independently verified and reproduced by at least two other investigators, writing codes in two entirely different programming languages. Austin Smith has reproduced the simulation using Excel Visual Basic, and John Reed has done the same using Mathematica (which, by the way, is an "interpreted" rather than "compiled" programming language).

    This leaves no doubt about the validity of my local model for the EPR-Bohm correlation, or of Chantal Roth's simulation of it (which, as you know, is discussed in the attached paper).

    Best,

    Joy

    Image 1

    Image 2

    Attachment #1: 17_whither.pdf

    Thanks, Joy!

    That is extremely good news, for a machine language that does not rely on a compiler to translate code to execution cannot be accused of rigging a result. The source code is executed on the spot and no information lost in translation.

    So it cannot but be the case that -- in the real-time execution of random input to a function continuous from an initial condition -- the correlation of two dichotomous variables of the function, recorded on intervals indifferent to the outcome of a coin-toss probability, is manifestly local.

    I'm celebrating with you, here on the other side of the pond.

    And all this from the simple assumption of the choice of topology! You might consider carving C_ab into a wooden bridge. :-)

    You know, it occurred to me how historically important the number 5 has been in complete axiomatic systems -- Euclid's postulates; the Dedekind-Peano arithmetic axioms; Lucien Hardy's five simple axioms of quantum mechanics -- and how replacing just one assumption with another changes the whole game. Such was the case with replacing Euclid's fifth postulate with one or another axiom of non-Euclidean geometry, which made Riemannian geometry viable for general relativity. Replacing Hardy's probability axiom with a topological postulate would seem to form a complete axiomatic basis for your framework and put it a giant step closer to a coherent and falsifiable theory.

    How would that postulate be worded?

    And one more thing I'd like to know: where are the critics who said this couldn't be done?

    All best,

    Tom

    "...a machine language that does not rely on a compiler to translate code to execution cannot be accused of rigging a result. The source code is executed on the spot and no information lost in translation."

    Did someone program the simulation in machine language?

    James Putnam

    James, all computer simulations are programmed by a set of instructions that are translated into a machine language native to the computer on which the instructions are executed.

    The difference between a compiled set of instructions and an interpreted set is slight; both are what is called Turing-complete. Both are efficient. Instructions that use compilers take a comparatively longer time to write and a comparatively shorter time to execute; interpreted instructions take a comparatively shorter time to write and a comparatively longer time to execute. It all comes out in the wash; some methods are better adapted to specific tasks than others.

    I suggest that to simulate a continuous function such as Joy's, with two randomly fluctuating (dichotomous) variables, the interpreted method reduces the chance that the random function (called ThrowDie in Chantal Roth's simulation) can be corrupted in the compilation of code. Though I can't speak as an expert -- perhaps one will show up.

    No computer actually computes a continuous function -- computer code is digital. A differential equation, e.g., is converted to a difference equation before being executed in a program. The approximation can be made arbitrarily close to the smooth function.
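
    For instance (a minimal sketch of my own, in Python, not taken from any of the simulations under discussion): the differential equation dx/dt = cos(t), whose solution is the sine wave, becomes the difference equation x_{n+1} = x_n + h cos(t_n), and the digital output approaches the smooth curve as the step h shrinks.

        import math

        def euler_sine_error(h, t_max=2 * math.pi):
            # iterate the difference equation standing in for dx/dt = cos(t)
            t, x, worst = 0.0, 0.0, 0.0
            while t < t_max:
                x += h * math.cos(t)
                t += h
                worst = max(worst, abs(x - math.sin(t)))
            return worst  # largest deviation from the exact sine

        for h in (0.1, 0.01, 0.001):
            print(h, euler_sine_error(h))  # error shrinks roughly in proportion to h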

    (Cf. the iteration of an n-sided polygon into the approximation of a circle. Or imagine that the randomly thrown straight lines of uniform length -- by which one can calculate curvature to arbitrary accuracy, by a Monte Carlo algorithm on a grid -- are not bound by grid lines, and nevertheless generate a beautiful regular curve, such as the sine wave of Joy's model, in a coordinate-free manner. Are we looking at the wave function of the universe? If such a wave lives in the space of all possible correlated points of a parallelized 3-sphere, we need postulate neither wave function collapse nor nonlocality to get the same strong quantum correlations as predicted by quantum theory.)
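
    The Monte Carlo remark can be made concrete too. A toy sketch of my own: in Buffon's needle, straight needles of uniform length are thrown at random onto a grid of parallel lines, and the crossing frequency recovers pi -- the curvature constant of the circle -- to arbitrary accuracy, even though no coordinates are imposed on the needles themselves.

        import math, random

        def buffon_pi(trials, needle=1.0, gap=1.0):
            # needles of uniform length thrown onto parallel lines spaced
            # `gap` apart; for needle <= gap, P(cross) = 2*needle/(pi*gap)
            hits = 0
            for _ in range(trials):
                y = random.uniform(0.0, gap / 2)          # centre to nearest line
                theta = random.uniform(0.0, math.pi / 2)  # needle orientation
                if y <= (needle / 2) * math.sin(theta):
                    hits += 1
            return 2 * needle * trials / (gap * hits)     # estimate of pi

        print(buffon_pi(1000000))  # ~ 3.14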

    Tom

    Tom,

    OK, I wasn't clear on the meaning of your statement. I looked at the code that I was aware of being posted and didn't see rows of ones and zeros. I expected that such a task was far too monumental to undertake. Since I am not an expert, I asked anyway.

    James Putnam

    Hi Tom,

    Let us not forget that

    (1) Professor Scott Aaronson claimed on his blog that the correlation predicted by my model is always constant and equal to -1.

    (2) The exquisitely qualified FQXi panelist whose report I have seen claimed that the correlation predicted by my model is always constant and equal to -1.

    (3) Professor Richard Gill claimed on these pages that the correlation predicted by my model is always constant and equal to -1.

    (4) Professor James Owen Weatherall claimed in his paper that the correlation predicted by my model is always constant and equal to -1.

    (5) A distinguished Editorial Board Member of Physical Review claimed in her report that the correlation predicted by my model is always constant and equal to -1.

    (6) Several less-than-distinguished critics of my work also claimed that the correlation predicted by my model is always constant and equal to -1.

    (7) I was declared a completely exposed charlatan and a crackpot, and my tiny research funding from FQXi was cut off.

    Now compare the above distinguished opinions with the following opinion of a humble machine:

    Image 1

    Image 2

      Joy, I've spent years now trying to understand why your critics' arithmetic differs from yours. And mine.

      I concluded at some point that because they always assume the quantum probability measure, they deny the continuous function that completes the correlation measure. They never see it -- it never happens. Anything that might have happened is "nonlocal."

      If one computes only on the basis of first order arithmetic, the probability is compelled to collapse to unity, as the wave function of a quantum observation is believed to collapse.

      Introduce second order arithmetic (analysis), and the game changes -- there is no collapse, no nonlocality.

      The error -- just as you always claimed -- is built right into the assumption of what space one is working in. Where first order arithmetic applies, a many-sided die gives one real result with n results in linear superposition; where second order arithmetic applies, there is no probability for a linear outcome. The order relation (the primitive binary relation) in second order arithmetic will fluctuate (0,1), (1,0) continuously -- if one is judging this fluctuation by first order axioms, one reasons that because the statement 0 < 1 is true and 1 < 0 is false on the positive real line R, what is less than 0 (the "distinguished member") is -1 and mathematically illegal.

      In the analytical case, however, because we are not confined to the space of the real line (topological space S^0), the distinguished member is a complex double zero {0,0}, such that a measurement function continuous from the initial condition, assuming the primitive binary relation and nondegeneracy, is either [0, +/-1] or [+/-1, 0], in which the closed interval makes the difference between judging results probabilistically on the open interval (-oo, +oo) and finding true deterministic correlation of left and right independent variables on parallelized topological spheres. S^0 is trivial; S^1 has the complex {0,0} but still allows the open interval. Only at S^3 do we encounter a closed manifold suitable for linear independence of the random variables; we know by complex function analysis that the only allowable results on the S^3 equator are +1, -1 and i (sqrt(-1)). We don't even need the complete physical space of S^7 to make the case for this subset of the parallelizable spheres, S^1, S^3, S^7 (for those unfamiliar with topology, the notation means the unit spheres in Euclidean spaces of two, four and eight dimensions, accommodating division algebras from the algebraically closed complex plane to the quaternion algebra (S^3) and the octonion algebra (S^7)).
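
      For anyone who wants to see the division-algebra property in the raw, here is a small numerical illustration -- my own Python sketch, nothing more: the quaternion norm composes multiplicatively, |pq| = |p||q|, so the unit quaternions (the points of S^3) are closed under multiplication and admit no zero divisors. That algebraic fact is the companion of the parallelizability of S^1, S^3 and S^7.

          import math, random

          def qmul(p, q):
              # Hamilton product of quaternions given as (w, x, y, z) tuples
              w1, x1, y1, z1 = p
              w2, x2, y2, z2 = q
              return (w1*w2 - x1*x2 - y1*y2 - z1*z2,
                      w1*x2 + x1*w2 + y1*z2 - z1*y2,
                      w1*y2 - x1*z2 + y1*w2 + z1*x2,
                      w1*z2 + x1*y2 - y1*x2 + z1*w2)

          def qnorm(q):
              return math.sqrt(sum(c * c for c in q))

          p = tuple(random.gauss(0, 1) for _ in range(4))
          q = tuple(random.gauss(0, 1) for _ in range(4))
          print(qnorm(qmul(p, q)), qnorm(p) * qnorm(q))  # equal, up to rounding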

      Let the critics come forward with counterarguments, if they have them. They certainly weren't shy about expressing their opinions when it wasn't clear whether the continuous-function simulation of nonlinear input could be programmed digitally (thanks again to courageous Chantal Roth) -- what now? Nothing to say?

      All best,

      Tom

      Joy,

      Congratulations. For your critics to retain credibility they must publicly don the hair shirt. The sign of a good scientist is to admit when they were wrong.

      I also hope you'll be generous and magnanimous in victory, another sign of a good scientist (if you're still alive when the win is noticed!)

      But I'm disappointed to have had no response or comment from you yet on my geometrical analogue of your finding. I have no desire to steal your thunder, or to give others an opportunity to drag it down, but I believe important physical insights emerge to consolidate the principle.

      I've had success making the point that Niels Bohr would likely say today that we could update 'what we can say' about the lack of structure of a particle, and test the effects of, for instance, toroidal geometry and chirality. The religious adherence to singlet states alone is now untenable.

      Have you yet researched the findings of an 'orbital asymmetry' of time-resolved single-particle correlations, predicted in my essay and now found discarded in Alain Aspect's data (not in his main paper)?

      Best wishes.

      Peter

      Hi Peter,

      Thank you for your kind words. I thought we had discussed your ideas before; but perhaps not the latest details you mention. Aspect's experiment has been superseded by many more careful and sophisticated experiments. The state-of-the-art experiments are now moving towards completely loophole-free experiments. All of these experiments confirm quantum mechanical predictions. Therefore your obligation is to reproduce the quantum mechanical predictions. I have not seen any derivations from you, even for the simplest cases. I speak equations. You seem to speak a language that I do not understand.

      Best,

      Joy

      Hi James,

      I better understand what you mean now. I also wondered whether writing a digital program of a continuous function with nonlinear input would be a "monumental task." The technical question is: "Is such a function algorithmically compressible?"

      Precise numerical implementation of a continuous function isn't possible in principle; however, if we accept that the arbitrarily close points of a discretized line (as mentioned, the conversion of a differential equation to a difference equation) are sufficiently smooth, then the problem remains how to randomize the input such that, wherever and whenever we insert on the line a pair of randomly generated dichotomous variables fluctuating between +1 and -1, the discrete choices remain linearly independent of the continuous function.
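
      A toy version of that requirement -- my own Python construction, not Roth's code: random +/-1 input drawn independently of a deterministic continuous function should show vanishing sample covariance with it, wherever along the line the pairs are inserted.

          import math, random

          n = 100000
          dice = [random.choice((-1, +1)) for _ in range(n)]         # ThrowDie-style input
          curve = [math.sin(2 * math.pi * k / n) for k in range(n)]  # deterministic function
          mean_d = sum(dice) / n
          mean_c = sum(curve) / n
          cov = sum((d - mean_d) * (c - mean_c)
                    for d, c in zip(dice, curve)) / n
          print(cov)  # ~ 0: the discrete choices carry no information about the curve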

      In all the previous simulations that purported to fail -- at least those I saw -- there was no randomized input. The critics simply did not seem to understand that an arbitrary choice of vector by the programmer cannot be counted as the initial condition of a function continuous from the initial condition; they create a dependent condition based on the experimenter's choice -- and then, when they don't get Joy's predicted correlations on their assumed probability space, they declare that something is wrong with the mathematics, rather than with their own assumptions about the initial condition and a probability space that isn't there in the first place. They don't grasp how Joy has left the choice of initial condition to nature and taken the experimenter out of it.

      The second worry that I had, apart from the algorithmic compressibility of the framework, is that if it should prove simulable, there remains the question: "Is the simulation of a continuous function itself a continuous function?" Only today has my last worry been laid to rest, on hearing of the algorithm being replicated in at least two other programming languages. This is important because it answers "yes" to the question and obviates any doubt that the model is independent of experimenter bias and that the variables are linearly independent.

      Here is a prediction you can quote me on:

      Because Gregory Chaitin has shown that linear arithmetic functions have a built-in uncertainty (Chaitin's Omega number is sensitive to the machine language running it), quantum computing algorithms based on linear superposition and quantum entanglement will fail. Preparation of the state vector in the complex Hilbert space biases the function. Better to start looking for nonlinear solutions and the arithmetic nonlinearity that powers them.

      Best,

      Tom

      Joy,

      I'm speaking the 3D+1 geometry of non-linear optics. We're observing the same volcano, but from opposite sides, so we see the same thing yet nothing the same!

      I derive the QM prediction geometrically, using helical dynamics obeying Malus's law, in my essay "IQbit: The Intelligent Bit."

      Most recent experiments use 'weak measurement' statistical methods, which are 'blind' to the cosine curves since they can't correlate time-resolved individual photon pairs. Now that single-photon production is more of a reality, I've proposed a refined Aspect experiment which should reproduce the 'orbital asymmetry' present in

      Tom, Joy,

      You should find compressible continuous functions in helical paths. I've been discussing this in an APS Quantum Physics blog. I reproduce a recent post below, and can probably dig out a thesis on the Gottfried-Jackson angle and helical frame if you can't find anything helpful.

      Post; "I haven't found any hint that it may represent a longitudinal component, but studying related experimental results (grazing grounds I've always found useful), does give consistent hints about axial helicity consistent for instance with Jackson-Minkowski rotation viewable from the Gottfried-Jackson frame. The latter is nothing to do with me; but on the 'GJ angle' between the lab frame momentum of the Higgs and an emitted photon in the resonance rest frame. Rotating by the angle alpha on the y-axis gives the helicity rest frame. (alpha differentiates the z-axis in that frame and the approaching photons 3-momentum vector). I'll find some authoritative links on that if you want.

      But translating the gobbledygook into possible physical models with a simplified description, it looks something like this: the (spin-0, but possibly spin-2!) Higgs may facilitate a marriage between particle and antiparticle into a coherent (massive) and conserved toroidal dynamic, by perhaps adding the second supplementary 'winding' spin of the body. So the primary spin may be considered as the 'ring' rotating (giving the primary helicity on axial translation [motion]), but the torus is held together by the poles (of the dipole) spinning, which produces the twin counter-'winding' of the torus.

      The logical analogies of this are quite wide. For instance, spin 1/2 and 1-1/2 can be physically represented by the charges ending up in the same place after only a half-integer rotation of the ring. Sure, it's all circumstantial and highly speculative, but only in a similar way to jigsaw puzzles, and I've tried hard to falsify it and can't. I really thought Bell's theorem would destroy it, but, as you've seen, it avoids Bell's (Bohr's 1920s) assumption of singlet spin states because it has another couple hidden away!

      The 'no-mirror symmetry' matter is also critical to the EPR case. Do take a look at that aspect and comment, as it seems to act as 'information pre-held' between the entangled particles." [PJ, 3 days ago]

      best wishes

      Peter