Byron Jennings | TRIUMF | Canada

Higgs versus Descartes: this round to Higgs.

René Descartes (1596 – 1650) was an outstanding physicist, mathematician and philosopher. In physics, he laid the groundwork for Isaac Newton’s (1642 – 1727) laws of motion through his pioneering work on the concept of inertia. In mathematics, he developed the foundations of analytic geometry, as reflected in the term Cartesian[1] coordinates. However, it is in his role as a philosopher that he is best remembered. This is rather ironic, as his breakthrough method was a failure.

Descartes’s goal in philosophy was to develop a sound basis for all knowledge, built on ideas so obvious they could not be doubted. His touchstone was that anything he perceived clearly and distinctly as being true was true. The archetypal example of this was the famous “I think, therefore I am.” Unfortunately, little else is as obvious as that famous claim, and even it can be––and has been––doubted.

Euclidean geometry provides the illusory ideal toward which Descartes and other philosophers have striven: you start with a few self-evident truths and derive a superstructure built on them. Unfortunately, even Euclidean geometry fails that test. The infamous parallel postulate has been regarded as a bit suspicious since ancient times, and even the other Euclidean postulates have been questioned; extending a straight line indefinitely depends on space being continuous, unbounded and infinite.

So how are we to take Euclid’s postulates and axioms? Perhaps we should follow the idea of Sir Karl Popper (1902 – 1994) and consider them to be bold hypotheses. This casts a different light on Euclid and his work; perhaps he was the first outstanding scientist. If we take his basic assumptions as empirical[2] rather than as sure and certain knowledge, all we lose is the illusion of certainty. Euclidean geometry then becomes an empirically testable model for the geometry of space. The theorems, derived from the basic assumptions, are predictions that can be checked against observations, satisfying Popper’s demarcation criterion for science. Do the angles in a triangle add up to two right angles or not? If not, then one of the assumptions is false, probably the parallel postulate.
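The triangle test above can be sketched numerically. On a curved surface such as a sphere, the angles of a geodesic triangle do not sum to two right angles, so measuring that sum really does test the geometric assumptions. This is a minimal illustration using NumPy; the helper function and the particular triangle (the “octant” triangle with a vertex at the pole) are my choices, not anything from the post.

```python
import numpy as np

def sphere_triangle_angles(a, b, c):
    """Interior angles of the geodesic triangle a-b-c on the unit sphere."""
    def angle_at(p, q, r):
        # Tangent directions at p toward q and r (project out the radial part),
        # then take the angle between them.
        tq = q - np.dot(q, p) * p
        tr = r - np.dot(r, p) * p
        cos_angle = np.dot(tq, tr) / (np.linalg.norm(tq) * np.linalg.norm(tr))
        return np.arccos(np.clip(cos_angle, -1.0, 1.0))
    return angle_at(a, b, c), angle_at(b, c, a), angle_at(c, a, b)

# Octant triangle: north pole plus two equator points 90 degrees apart.
a = np.array([0.0, 0.0, 1.0])
b = np.array([1.0, 0.0, 0.0])
c = np.array([0.0, 1.0, 0.0])
total = sum(sphere_triangle_angles(a, b, c))
print(np.degrees(total))  # ≈ 270, not 180: the parallel postulate fails here
```

A surveyor measuring a large enough triangle and finding an angle sum different from 180° would, in exactly Popper’s sense, have falsified one of Euclid’s assumptions.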

Back to Descartes: he criticized Galileo Galilei (1564 – 1642) for having built “without having considered the first causes of nature”; Galileo “has merely sought reasons for particular effects; and thus he has built without a foundation.” In the end, that lack of a foundation turned out to be less of a hindrance than Descartes’s faulty one. To a large extent, science’s lack of a foundation of the kind Descartes wished to provide has not proved a significant obstacle to its advance.

Like Euclid, Sir Isaac Newton had his basic assumptions—the three laws of motion and the law of universal gravity—but he did not believe that they were self-evident; he believed that he had inferred them by the process of scientific induction. Unfortunately, scientific induction was as flawed a foundation as the self-evident nature of the Euclidean postulates. Connecting the dots between a falling apple and the motion of the moon was an act of creative genius, a bold hypothesis, and not some algorithmic derivation from observation.
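Newton’s apple-to-moon connection can be replayed as a back-of-the-envelope check: if gravity falls off as 1/r², the Moon’s centripetal acceleration should equal surface gravity g scaled down by (distance to Moon / Earth’s radius)², roughly 60². A minimal sketch with standard textbook values (the variable names and round numbers are mine):

```python
import math

g = 9.81                  # surface gravity, m/s^2
R_earth = 6.371e6         # Earth's radius, m
r_moon = 3.844e8          # mean Earth-Moon distance, m
T = 27.32 * 86400         # sidereal month, s

# Centripetal acceleration of the Moon in its (nearly circular) orbit.
a_moon = (2 * math.pi / T) ** 2 * r_moon

# Inverse-square prediction: g scaled down by (r_moon / R_earth)^2.
a_pred = g / (r_moon / R_earth) ** 2

print(a_moon, a_pred)  # both come out around 2.7e-3 m/s^2
```

The two numbers agree to about a percent, which is the quantitative content of Newton’s bold hypothesis; but nothing in the arithmetic forced the 1/r² guess in the first place.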

It is worth noting that, at the time, Newton’s explanation had a strong competitor in Descartes’s theory that planetary motion was due to vortices, large circulating bands of particles that kept the planets in place. Descartes’s theory had the advantage that it lacked the occult action at a distance that is fundamental to Newton’s law of universal gravitation. In spite of that, today Descartes’s vortices are as forgotten as his claim that the pineal gland is the seat of the soul; so much for what he perceived clearly and distinctly as being true.

Galileo’s approach of solving problems one at a time, rather than trying to solve all problems at once, has paid big dividends. It has allowed science to advance one step at a time, while Descartes’s approach has faded away as failed attempt followed failed attempt. We still do not have a grand theory of everything built on an unshakable foundation, and probably never will. Rather, we have models of widespread utility. Even if they are built on a shaky foundation, surely that is enough.

Peter Higgs (b. 1929) follows in the tradition of Galileo. He has not, despite his Nobel Prize, succeeded where Descartes failed in producing a foundation for all knowledge; but through creativity he has proposed a bold hypothesis whose implications have been empirically confirmed. Descartes would probably claim that he has merely sought reasons for a particular effect: mass. The ultimate question about life, the universe and everything remains unanswered, much to Descartes’s chagrin, but as scientists we are satisfied to solve one problem at a time and then move on to the next one.

To receive a notice of future posts follow me on Twitter: @musquod.


[1] Cartesian, from Descartes’s Latinized name, Cartesius.

[2] As in the final analysis they are.

3 Responses to “Higgs versus Descartes: this round to Higgs.”

  1. rufus warren says:

Or mass is based upon a single distribution of change bundle stability based on objects’ response to our planet; what else is wrong with this picture?

  2. Rufus Warren says:

    Axiom 1: there exist discrete charged particles with values e[+1, 0, -1, +2/3, -1/3, ?]
    Axiom 2: the speed of the wave-front of light is the same for all observers independent of the light emitter

    or

    Axiom 3: values e[+/- 1]
    Axiom 4: the speed of the wave-front vector varies from -infinity to +infinity.

    with axioms 3&4 the speed of cosmic galaxies can be inferred using two possibilities with the only certain value of distance between repeated events divided by the time between repeated events or EM waves.

    The other possibilities along the two other dimensions of the velocity vector may be calculated from the matter distribution in space. Plus or minus value of the wave-front speed vector as moving into or away from us only describes a one dimensional measurement with an uncertainty of direction. Using an axiom that each transmitted wave moves at speed c then we may adjust distance to objects based on the data timed over a proper period. So the question is does there exist a single solution for the velocity distribution of space objects based on a fact that multiple charge clusters attract each other. To be determined if we have enough sufficient and necessary evidence to describe motion in space.

    A hint would be moving into and object directly should be hotter than any other direction if all directions have similar matter distributions. Expect at least one spatial direction to be hotter than others and at least one cooler direction.

Hence each measurement may have 2 values: +/- Vw = +/- c lambda emitted/lambda observed, the speed of the measured wave-front; hence a computer program to take our measurements, corrected, and redefine the velocities of intergalactic motion based on this physics.

Or use the present implication that charge may vary from -1e through +1e with light speed a constant, but this hunt would be better with a simulator rather than a smasher. The smasher should verify, not locate.

    Idea: Create a simulation of the proton flow using Maxwell to describe the interactions and again the standard model. Hence does the distribution on measurements depend upon the spot profile of the beam and manner of splitting the beam. Or are we just establishing a mode and giving it a name? Can this be visualized mentally with a smasher, doesn’t seem to be a difficult problem to simply visualize the energy distribution?

    Think our predictability based upon our physics should define certainty, not a redefinition of the theories expected values. Stochastic processes are indications of truth and fantasy, depending upon interpretation.

I’m still wondering what kind of optical misconnection can create a delay greater than one meter and not be noticed that it is floating in air.

  3. Gavin Flower says:

From what little I understand of M-Theory, there may be disjoint parts of the Universe with different values for one or more of the quantities we generally consider constant. Note that M-Theory has time plus the usual 3 space dimensions that we think we are comfortable with, plus 6 additional space dimensions that are tiny and wrapped up, plus another dimension that I’m not sure how to describe. One of the problems is that the topology of how all these dimensions interact is not known for certain.

Note that the ‘strings’ in M-Theory are thought to be about the Planck length in size – so, very roughly, the ratio of the size of a string to a proton is the ratio of a tennis ball to the size of our galaxy! Remember how big the LHC is? Using known physics, you’d need a machine bigger than our galaxy to ‘see’ things the size of one of these strings – as far as I can tell. So we can only test M-Theory indirectly. I get the impression that, experimentally, supersymmetry is far easier to test for (but I could be wrong). The vibrations of the ‘things’ in M-Theory and its topology relate to the nature of fundamental particles and the various force carriers – the graviton is a natural consequence of M-Theory.

One of the criticisms of M-Theory is that it does not appear to have just one possible solution, which some people object to because they feel very strongly (often an actual belief – an arbitrary statement that something is TRUE) that it should. However, even if there is more than one solution, we may never have the means of practically determining whether the other ones actually exist. Assuming at least one other solution ‘exists’, what do we mean by it existing? Does it exist next to us; if so, is it space-wise or time-wise or … The mind boggles, at least mine does!

M-Theory blurs the distinction between Philosophy and Science. Antagonists claim it is not Science, and Proponents do not claim that it is the last word. One of the problems is that it is so mathematically difficult that we can only deal in very poor approximations, so coming up with meaningful experimentally verifiable predictions is decidedly non-trivial and positive results are not definitively conclusive.

M-Theory is very challenging in a philosophical sense, and is very difficult scientifically (to put it mildly). However, I think it is the best we have to explain the nature of things, and its difficulties should not be used to discredit it. I’m sure most Proponents of M-Theory would love to have a new theory that was far more tractable, yet explained things at least as well!
