Byron Jennings | TRIUMF | Canada


Knowledge and the Higgs Boson

This essay makes a point that is only implicit in most of my other essays – namely, that scientists are arro—oops, that is for another post. The point here is that science is defined not by how it goes about acquiring knowledge but rather by how it defines knowledge. The underlying claim is that the definitions of knowledge used, for example, in philosophy are not useful, and that science has the one definition that has so far proven fruitful. No, not arrogant at all.

The classical concept of knowledge was described by Plato (428/427 BCE – 348/347 BCE) as having to meet three criteria: it must be justified, true, and believed. That description does seem reasonable. After all, can something be considered knowledge if it is false? Similarly, would we consider a correct guess knowledge? Guess right three times in a row and you are considered an expert – but do you have knowledge? As for believed, I have more trouble with that criterion: believed by whom? Certainly, something that no one believes is not knowledge even if true and justified.

The above criteria for knowledge seem like common sense, and the ancient Greek philosophers had a real knack for encapsulating the common-sense view of the world in their philosophy. But common sense is frequently wrong, so let us look at those criteria with a more jaundiced eye. Let us start with the first criterion: it must be justified. How do we justify a belief? From the sophists of ancient Greece, to the post-modernists and the anything-goes hippies of the 1960s, and all their ilk in between, it has been demonstrated that what can be known for certain is vanishingly small.

René Descartes (1596 – 1650) argues at the beginning of his Discourse on the Method that all knowledge is subject to doubt, a process called methodological skepticism. To a large extent, he is correct. Then, to get to something that is certain, he came up with his famous statement: I think, therefore I am. For a long time this seemed to me like a sure argument. Hence, “I exist” seemed an incontrovertible fact. I then made the mistake of reading Nietzsche[1] (1844 – 1900). He criticizes the argument as presupposing the existence of “I” and “thinking”, among other things. It has also been criticized by a number of other philosophers, including Bertrand Russell (1872 – 1970). To quote the latter: “Some care is needed in using Descartes’ argument. ‘I think, therefore I am’ says rather more than is strictly certain. It might seem as though we are quite sure of being the same person to-day as we were yesterday, and this is no doubt true in some sense. But the real Self is as hard to arrive at as the real table, and does not seem to have that absolute, convincing certainty that belongs to particular experiences.” Oh well, back to the drawing board.

The criteria for knowledge, as postulated by Plato, lead to knowledge either not existing or being of the most trivial kind. No belief can be absolutely justified, and there is no way to tell for certain whether any proposed truth is an incontrovertible fact. So where are we? If there are no incontrovertible facts, we must deal with uncertainty. In science we make a virtue of this necessity. We start with observations but, unlike the logical positivists, we do not assume they are reality or correspond to any ultimate reality. Thus, following Immanuel Kant (1724 – 1804), we distinguish the thing-in-itself from its appearances. All we have access to are the appearances. The thing-in-itself is forever hidden.

But all is not lost. We make models to describe past observations. This is relatively easy to do. We then test our models by making testable predictions for future observations. Models are judged by their track record in making correct predictions – the more striking the prediction, the better. The standard model of particle physics and its prediction of the Higgs[2] boson is a prime example of science at its best. The standard model did not become a fact when the Higgs was discovered; rather, its standing as a useful model was enhanced. It is this reliance on a track record of successful predictions that is the demarcation criterion for science and, I would suggest, the hallmark for defining knowledge. The scientific models and the observations they are based on are our only true knowledge. However, to mistake them for descriptions of the ultimate reality or the thing-in-itself would be folly, not knowledge.

[1] Reading Nietzsche is always a mistake. He was a madman.

[2] To be buzzword compliant, I mention the Higgs boson.


  • Rick Baartman

    Descartes was quite the guy, it’s true. But he only lived till 1650.

  • Tom

    Sad to see you being rude about Nietzsche, whose writings I’ve loved all my adult life. Sure, he had a mental collapse in his latter years, but his works are among the most thought-provoking of the 19th century, as well as being wonderfully written. To call him a madman is naive and, frankly, unworthy of anyone claiming to understand philosophy. Cheers, and be good,

    Tom

  • Mike Decker

    Knowledge doesn’t have to be justified. Common knowledge that you tend to take for granted, without even being aware of it most of the time. It’s quite possible to divulge, if you’re taking a bit of an interest, but what you’re disclosing isn’t likely to be a shocking revelation. An aspect that’s strived for constantly in the mass media, the advertising industry, particularly. With the scientific community being drawn in, as you acknowledge in your last paragraph. It has to seem au courant, for the sake of a general audience, even if they’re getting kind of bored with the subject.
    And it’s a form of terrorism. There’s a peculiar kind of sanctimonious arrogance associated with currently fashionable controversies, and it isn’t confined to the scientific community. Running scared, if anything, needing to justify healthy salaries by pandering to the misconceptions that might be prevailing. Postulating a multiverse of some kind, which continues to go over quite well in some circles, as evidence of a Darwinian winnowing out process, at the expense of the dull-witted, presumably.
    Seems harmless enough. If anything, it’s the enthusiasts that will be getting themselves into deep trouble. Within the cloister, if you like, strongly reminiscent of the scientific revolution that animated philosophers like Nietzsche. And his successors as well. The postmodern theorists. Not to mention any number of grab bag New Age atheists wanting to make their revelations seem scientific somehow, as if it’s no longer necessary to have faith. You can choose to believe that you’re living in the realm of illusion.
    Climate change comes to mind, as a vacant obsession. You can never really be too sure about whether it’s a significant feature or not, or about what the contributing factors might be. It’s the heat capacity of the earth, for instance, which will tend to predominate. The heat capacity of the atmosphere is rather negligible, by comparison. An incredibly slow, gradual diffusion process, but well understood. And it may not have been taken into account. And the focus has been mainly on the greenhouse effect, to the exclusion of more obvious balancing factors.
    It’s the enthusiasts that have a lot to answer for, to the extent that they want to turn it into an unacknowledged inquisition of some kind. And it must have originated twenty odd years ago, around the time of the fall of the former Soviet Union. The cold war was over with. A rapprochement had been arrived at, even if it seemed kind of inconsequential. Surprising in light of the chasm that was presumed to have divided the world for decades previously. It wasn’t seen to be a danger, in the sense that the consensus being arrived at might be hugely bogus, for the sake of a select few, a predicament famously parodied by George Orwell. Which kind of leaves you hanging by now. Waiting around for the roles to be reversed, I suppose. The unthinkable, for what it really means.

  • Jay

    I step onto these comment threads with a great sense of trepidation—but I thought it important to second the call for a bit more respect to be shown to Nietzsche. The immense insightfulness and humanity of his work are breathtaking, wonderful, enriching. His work is immortal.