Posts Tagged ‘Philosophy of science’

As long-time readers of Quantum Diaries know, I have been publishing here for a number of years, and this is my 85th and last post[1]. A couple of years ago, I gathered the then-current collection, titled it “In Defense of Scientism,” after the title of one of the essays, and sent it off to a commercial publisher. Six months later, I got an e-mail from the editor complaining that he had lost the file and only found it by accident, and he somehow inferred that it was my fault. After that experience, it was no surprise he did not publish it.

With all the talk of self-publishing these days, I thought I would give it a try. It is easy, at least compared to finding the Higgs boson! There are a variety of options that give different levels of control, so one can pick and choose preferences – like off an à la carte menu. The simplest form of self-publishing is to go to a large commercial publisher. The one I found would, for $50.00 USD up front and $12.00 a year, supply print on demand and e-books to a number of suppliers. Not sure that I could recover the costs from the revenue – and being a cheapskate – I decided not to go that route. There are also commissioned alternatives with no upfront costs, but I decided to interact directly with three (maybe four, if I can jump over the humps the fourth has put up) companies. One of the companies treated their print-on-demand and digital distribution arms as distinct, even to the point of requiring different reimbursement methods. That is the disadvantage of doing it yourself: sorting it all out. The advantage of working directly with the suppliers is more control over the detailed formatting and distribution.

From then on, things got fiddly[2]: reimbursement, for example. Some companies would only allow payment by electronic fund transfer, others only by check. The weirdest example was one company that did electronic fund transfers unless the book was sold in Brazil or Mexico. In those cases, payment is by check, but only after $100.00 has accumulated. One company verified, during account setup, that the fund transfer worked by transferring a small amount – in my case, 16 cents. And then, of course, there are special rules if you earn any money in the USA. For USA earnings there is a 30% withholding tax unless you can document that there is a tax treaty that allows you to get around it. The USA is the only country that requires this. Fine; being an academic, I am used to jumping through hoops.
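As a sketch of how those payout rules interact, here is a toy function; the 30% rate and the $100.00 threshold are the figures described above, while the function name and parameters are purely hypothetical.

```python
def royalty_payout(gross, us_earnings=False, treaty_documented=False,
                   country=None, accumulated=0.0):
    """Toy model of the payout rules described above (names are made up).

    - US earnings incur a 30% withholding tax unless a treaty is documented.
    - Sales in Brazil or Mexico pay out (by check) only once $100.00
      has accumulated; until then the payout is held back.
    """
    net = gross
    if us_earnings and not treaty_documented:
        net *= 0.70  # 30% US withholding tax
    if country in ("Brazil", "Mexico"):
        total = accumulated + net
        # Payment is held until the $100.00 threshold is reached
        return total if total >= 100.0 else 0.0
    return net
```

For instance, $100.00 of US earnings without treaty paperwork nets $70.00, while a $50.00 Brazilian sale pays nothing until earlier accumulated royalties push the total past $100.00.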

Next was the question of an International Standard Book Number (ISBN). They are not required but are recommended. That is fine since in Canada you can get them for free. Just as well, since each version of the book needs a different number. The paperback needs a different number from the electronic version, and each different electronic format requires its own number. As I said, it is a good thing it is free. Along with the ISBN, I got a reminder that Library and Archives Canada requires one copy of each book that sells more than four copies, two copies if it goes over a hundred, and of course a separate electronic copy if you publish electronically. Fun, fun, fun[3]. There are other implications of getting your own ISBN. Some of the publishers would supply an ISBN free of charge but would then put the book out under their own imprint and, in some cases, give wider distribution to those books. But again, getting your own number ultimately gives you more control.
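Since each format needs its own ISBN, it may be worth noting that the last digit of an ISBN-13 is a checksum; a minimal sketch of the standard calculation (the function name is mine):

```python
def isbn13_check_digit(first12: str) -> int:
    """Compute the ISBN-13 check digit from the first 12 digits.

    Digits are weighted alternately by 1 and 3; the check digit is
    whatever brings the weighted sum up to a multiple of 10.
    Hyphens and spaces in the input are ignored.
    """
    digits = [int(c) for c in first12 if c.isdigit()]
    assert len(digits) == 12, "expected 12 digits before the check digit"
    total = sum(d * (1 if i % 2 == 0 else 3) for i, d in enumerate(digits))
    return (10 - total % 10) % 10

# Example with a commonly quoted test ISBN, 978-0-306-40615-7:
print(isbn13_check_digit("978-0-306-40615"))  # → 7
```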

With all this research in hand, it was time to create and format the content. I had the content from the four years’ worth of Quantum Diaries posts, and all I had to do was put it together and edit for consistency. Actually, Microsoft Word worked quite well, with various formatting features to help. I then gave it to my wife to proofread. That was a mistake; she is still laughing at some of the typos. At least there are now an order of magnitude fewer errors. I should also acknowledge the many editorial comments from successive members of the TRIUMF communications team.

The next step was to design the book cover. There comes a point in every researcher’s career when they need support and talent outside of themselves. Originally, I had wanted to superimpose a picture of a model boat on a blackboard of equations. With that vision in mind, I set about the hallways to seek and enroll the talent of a few staff members who could make it happen. After normal working hours, of course. A co-op communication student suggested that the boat be drawn on the blackboard rather than a picture superimposed. The equations were already on a blackboard and are legitimate. The boat was hand drawn by a talented lady in accounting, drawing it first onto an overhead transparency[4] and then projecting it onto a blackboard. A co-op student in the communications team produced the final cover layout according to the various colour codes and margin bleeds dictated by each publisher. For both my own and your sanity, I won’t go into all the details. In the end, I rather like how the cover turned out.

For print-on-demand, they wanted a separate PDF for the cover and for the interior. They sent very detailed instructions, so that was no problem. It only took about three tries to get it correct. The electronic version was much more problematic. I wonder if the companies that produce both paper and digital get it right. I suspect not. There is a free program that converts from Word to epub format, but the results have some rather subtle errors, like messing up the table of contents. I ended up using a conversion service that one of the digital publishers provides free of charge. If you buy a copy and it looks messed up, I do not want to hear about it.[5] One company (the fourth mentioned above) added a novel twist. I jumped all the hoops related to banking information for wire transfers, did the USA tax stuff, and then went to upload the content. Ah, I needed to download a program to upload the content. That should not have been a problem, but it ONLY runs on their hardware. The last few times I used their hardware it died prematurely, so they can stuff it.
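Those subtle conversion errors are why it pays to sanity-check the generated file. An EPUB is a ZIP archive whose first entry must be an uncompressed file named mimetype containing application/epub+zip, with a META-INF/container.xml alongside it; a minimal check using only the Python standard library (the function name is mine) might look like:

```python
import zipfile

def check_epub_container(path) -> bool:
    """Check the basic EPUB container rules for a .epub file (path or
    open file object): the first archive entry must be an uncompressed
    file named 'mimetype' containing 'application/epub+zip', and
    META-INF/container.xml must be present."""
    with zipfile.ZipFile(path) as zf:
        names = zf.namelist()
        if not names or names[0] != "mimetype":
            return False
        if zf.getinfo("mimetype").compress_type != zipfile.ZIP_STORED:
            return False
        if zf.read("mimetype") != b"application/epub+zip":
            return False
        return "META-INF/container.xml" in names
```

This only checks the container, not the table of contents or the markup inside, but it catches the most common packaging mistakes.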

Now, several months after I started the publishing process, I have jumped through all the hoops! All I have to do is sit back and let the money roll in so I can take early retirement. Well, at my age, early retirement is no longer a priori possible, but at least I hope to get enough money to buy the people who helped me prepare the book a coffee. So everyone, please rush out and buy a copy. Come on, at least one of you.

As a final point, you may wonder why there is a drawing of a boat on the cover of a book about the scientific method. Sorry, to find out you will have to read the book. But I will give you a hint. It is not that I like to go sailing. I get seasick.

To receive a notice of my blogs and other writing follow me on Twitter: @musquod.

[1] I know, I have promised this before, but this time trust me. I am not like Lucy in the Charlie Brown cartoons, pulling the football away.

[2] Epicurus, who made the lack of hassle the greatest good, would not have approved.

[3] Reminds me of an old Beach Boys song.

[4] An old overhead projector was found in a closet.

[5] Hey! We got through an entire conversation about formatting and word processing software without mentioning LaTeX despite me having been the TRIUMF LaTeX guru before I went over to the dark side and administration.

Choose as Many as You Like

Tuesday, April 7th, 2015

I want to understand the universe.

I want to understand how the universe works.

I want to build models of how the universe works that predict the results of experiments.

I want to build models of how the universe works that predict the results of experiments, because I believe those models get closer and closer to the truth.

I want to build models of how the universe works that predict the results of experiments, because I believe those models get closer and closer to the true rules of reality.

I want to build models of how the universe works that predict the results of experiments, because I believe that although it’s unknowable whether reality has “true” rules, building better and better models is the closest we can get.

I want to build models of how the universe works that predict the results of experiments, because I believe that understanding the true rules of reality will help us understand why the universe exists.

I want to build models of how the universe works that predict the results of experiments, because I believe that understanding the true rules of reality will shed light on the Creator of the universe.

I want to build models of how the universe works that predict the results of experiments, because I believe that the more we can explain without religion the less people will rely on it.

I want to build models of how the universe works that are simple and beautiful.

I want to build models of how the universe works that are simple and beautiful, because these models have the best track record of predicting the broadest range of experimental results.

I want to build models of how the universe works that are simple and beautiful, because I believe the true rules of reality are simple and beautiful.

I want to understand enough of how the universe works that I can build machines to improve people’s lives.

I want to understand enough of how the universe works that I can find new ways to save lives and heal the sick and injured.

I want to understand enough of how the universe works that I can help us stop endangering the climate of our planet.

I want to understand how the universe works so that other people can someday find new practical ways to improve and save lives, even if I don’t quite know what they are and probably won’t work on them myself.

I want to build machines for studying how the universe works, because I find working on those machines to be challenging and fulfilling.

I want to write programs for analyzing data from experiments on how the universe works, because I find analyzing data to be challenging and fulfilling.

Choose as many as you like. If other people want to hear about it, tell them – or, if you prefer, don’t. And if you have more you’d like to add, leave them in the comments!

Physicists frequently stray into the field of philosophy; notable examples include Thomas Kuhn (1922 – 1996) and Henri Poincaré (1854 – 1912). This is perhaps because physicists frequently work in areas far removed from everyday experience and, in order to communicate their ideas successfully, must deal with the underlying assumptions explicitly. Although less well known today than Kuhn and Poincaré, Percy Bridgman (1882 – 1961) also falls into this category. In physics, he is noted for his work on high-pressure physics, winning the Nobel Prize in 1946. In philosophy, he is credited with coining the term OPERATIONAL DEFINITION and promoting the idea of operationalism. These ideas are laid out in his 1927 book: THE LOGIC OF MODERN PHYSICS. If nothing else, it shows the folly of using MODERN in book titles. Nonetheless, it is an interesting little book and, in its time, was quite influential.

In his book, Bridgman introduces several interesting ideas, for example, that when one explores new areas in science, one should not be surprised that the supporting concepts have to change. Hence we should not be surprised when classical concepts fail in the relativistic or quantum domains. This illustrates why interpretations of quantum mechanics, explaining it in terms of classical concepts, are poorly motivated. A related idea is that an explanation is the description of a given phenomenon in terms of familiar concepts. Of course, with this definition, what qualifies as a valid explanation depends on what the explainee is familiar with. If one cannot succeed using established concepts, one must explain the new idea using familiar, albeit far removed, concepts. But what happens when even this does not work? Bridgman suggests that the solution is to introduce new concepts and become familiar with them. Seems reasonable to me. Thus quantum mechanics can be explained in terms of the (familiar to me) concept of the wave function; no need for many worlds and the like.

While it is natural to think of high speed (relativity) or small size (quantum mechanics) as new areas of science, Bridgman includes increased precision as well. He talks about the penumbra of uncertainty that surrounds all measurements and that is penetrated by increasing the precision of the measurements. Thus the idea of the distinct high-energy and precision frontiers, commonly discussed in modern particle physics planning exercises, goes back at least to 1927.

Bridgman was also a phenomenologist to the core. He did not believe that a priori knowledge could constrain what could happen; in his words: Experience is determined only by experience. C.I. Lewis (1883 – 1964), in his 1929 book Mind and the World-Order, agrees. Similar ideas, in books of about the same time, indicate the concerns of that age.

Despite these interesting sidelights, the main idea in THE LOGIC OF MODERN PHYSICS is that concepts are defined by how they are measured, that is, by the measurement operation – hence the term operationalism. So why was he interested in operational definitions? It was to avoid the problem in classical mechanics where concepts like distance and time were taken for granted. It then came as a shock when those concepts proved to be rather complex once special relativity was invented. To avoid such shocks in the future, Bridgman proposed the idea of operational definitions. For example, to measure length you go down to the local Canadian Tire® store (in the USA it would be Walmart®), buy a tape measure, and use it to measure length. Thus the concept of length is defined by Canadian Tire®, oops, I mean by a tape measure. What if I measure length by surveying techniques that make use of triangulation? Bridgman claimed that that is a distinctly different concept and is covered by the same term only for convenience. Here at TRIUMF, distance and location are also measured using laser tracking. This is again a different concept from the original concept of length. Things get even more complicated when we talk about the distance to stars, which uses yet another operation. Bridgman suggests that length loses its meaning at lengths less than the size of the electron, because such lengths cannot be measured. Today we would say they can be measured, but length in that case is simply a parameter, in a mathematical formula, describing the scattering of particles. Hence we do not have one concept of length or distance but many, although they agree numerically in regions where the techniques overlap.

Bridgman then goes on to consider various other concepts and how they might be defined operationally. He seems to have been very much influenced by Albert Einstein (1879 – 1955) and Einstein’s discussion of the synchronization of clocks (which actually goes back to Poincaré). The possible operational definitions of velocity are particularly interesting. In contradistinction to Einstein’s definition, based on clocks synchronized and distances measured in a fixed inertial frame, Bridgman suggests that the velocity of a car could also be defined by counting the mileposts the car passes to determine distance and using the clock on the car dashboard to measure time. This velocity can become infinite and would be useful to a person going to a distant solar system who is interested in how many of his years it takes to get there. For most purposes Einstein’s definition is more convenient, and hence it is the one in textbooks, though other definitions remain possible.
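Bridgman’s milepost-and-dashboard-clock velocity is road-frame distance divided by the traveller’s own (proper) time, which special relativity makes gamma times the ordinary velocity; a quick sketch (in units where c = 1, with function names of my own choosing) shows how it grows without bound:

```python
import math

def lorentz_gamma(v: float) -> float:
    """Lorentz factor for speed v, in units where c = 1."""
    return 1.0 / math.sqrt(1.0 - v * v)

def milepost_velocity(v: float) -> float:
    """Bridgman's 'milepost' velocity: road-frame distance divided by the
    traveller's proper time. Time dilation makes this gamma * v, so it
    exceeds 1 (the speed of light) once v > 1/sqrt(2) and grows without
    bound as v approaches 1."""
    return lorentz_gamma(v) * v

for v in (0.6, 0.9, 0.99, 0.999):
    print(f"v = {v}: milepost velocity = {milepost_velocity(v):.2f}")
```

At v = 0.6 the milepost velocity is 0.75; by v = 0.999 it is over 22 – the two definitions agree only at low speeds, which is exactly Bridgman’s point about distinct operational concepts sharing a name.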

And on it goes. In some cases the definitions seemed quite forced. Nevertheless, three groups of people picked up on the idea of operational definitions. One group was the logical positivists; they tried to avoid theory and were pleased when a physicist gave definitions directly in terms of observables. The second group was the psychologists, who wanted a more secure foundation for their subject. The third group was in quality control and business management, where Walter Shewhart (1891 – 1967) and W. Edwards Deming (1900 – 1993) adopted the idea.

However, the concept, as the be-all and end-all of meaning, had its problems. Like logical positivism, it missed the idea that the meaning is in the model. While we may have different ways to measure length, there is a common idea behind them all. We can consider this common idea to be an abstraction from the different operationally defined concepts, or we can take the operational definitions as approximations to the abstract idea. One could say that operationally there is no difference between the two approaches.

Ultimately, operational definitions are useful. They tie concepts tightly to observations, where they are less likely to be dislodged by future discoveries or new models. They also help eliminate fuzzy thinking. Concepts that lack operational definitions are, in general, poorly defined. Who knows, I might even take the concept of scientific realism seriously if someone gave me an operational definition of it.

To receive a notice of my future posts and my pending book, In Defense of Scientism, follow me on Twitter: @musquod.

Will Self’s CERN

Friday, January 16th, 2015

“It doesn’t look to me like the rose window of Notre Dame. It looks like a filthy big machine down a hole.” — Will Self

Like any documentary, biography, or other educational program on the radio, Will Self’s five-part radio program Self Orbits CERN is partially a work of fiction. It is based, to be sure, on a real walk through the French countryside along the route of the Large Hadron Collider, on the quest for a promised “sense of wonder”. And it is based on real tours at CERN and real conversations. But editorial and narrative choices have to be made in producing a radio program, and in that sense it is exactly the story that Will Self wants to tell. He is, after all, a storyteller.

It is a story of a vast scientific bureaucracy that promises “to steal fire from the gods” through an over-polished public relations team, with day-to-day work done by narrow, technically-minded savants who dodge the big philosophical questions suggested by their work. It is a story of big ugly new machines whose function is incomprehensible. It is the story of a walk through thunderstorms and countryside punctuated by awkward meetings with a cast of characters who are always asked the same questions, and apparently never give a satisfactory answer.

Self’s CERN is not the CERN I recognize, but I can recognize the elements of his visit and how he might have put them together that way. Yes, CERN has secretariats and human resources and procurement, all the boring things that any big employer that builds on a vast scale has to have. And yes, many people working at CERN are specialists in the technical problems that define their jobs. Some of us are interested in the wider philosophical questions implied by trying to understand what the universe is made of and how it works, but some of us are simply really excited about the challenges of a tiny part of the overall project.

“I think you understand more than you let on.” — Professor Akram Khan

The central conflict of the program feels a bit like it was engineered by Self, or at least made inevitable by his deliberately cultivated ignorance. Why, for example, does he wait until halfway through the walk to ask for the basic overview of particle physics that he feels he’s missing, unless it adds to the drama he wants to create? By the end of the program, he admits that asking for explanations when he hasn’t learned much background is a bit unfair. But the trouble is not whether he knows the mathematics. The trouble, rather, is that he’s listened to a typical, very short summary of why we care about particle physics, and taken it literally. He has decided in advance that CERN is a quasi-religious entity that’s somehow prepared to answer big philosophical questions, and never quite reconsiders the discussion based on what’s actually on offer.

If his point is that particle physicists who speak to the public are sometimes careless, he’s absolutely right. We might say we are looking for how or why the universe was created, when really we mean we are learning what it’s made of and the rules for how that stuff interacts, which in turn lets us trace what happened in the past almost (but not quite) back to the moment of the Big Bang. When we say we’re replicating the conditions at that moment, we mean we’re creating particles so massive that they require the energy density that was present back then. We might say that the Higgs boson explains mass, when more precisely it’s part of the model that gives a mechanism for mass to exist in models whose symmetries forbid it. Usually a visit to CERN involves several different explanations from different people, from the high-level and media-savvy down to the technical details of particular systems. Most science journalists would put this information together to present the perspective they wanted, but Self apparently takes everything at face value, and asks everyone he meets for the big picture connections. His narrative is edited to literally cut off technical explanations, because he wants to hear about beauty and philosophy.

Will Self wants the people searching for facts about the universe to also interpret them in the broadest sense, but this is much harder than he implies. As part of a meeting of the UK CMS Collaboration at the University of Bristol last week, I had the opportunity to attend a seminar by Professor James Ladyman, who discussed the philosophy of science and the relationship of working scientists to it. One of the major points he drove home was just how specialized the philosophy of science can be: that the tremendous existing body of work on, for example, interpreting Quantum Mechanics requires years of research and thought which is distinct from learning to do calculations. Very few people have had time to learn both, and their work is important, but great scientific or great philosophical work is usually done by people who have specialized in only one or the other. In fact, we usually specialize a great deal more, into specific kinds of quantum mechanical interactions (e.g. LHC collisions) and specific ways of studying them (particular detectors and interactions).

Toward the end of the final episode, Self finds himself at Voltaire’s chateau near Ferney, France. Here, at last, is what he is looking for: a place where a polymath mused in beautiful surroundings on both philosophy and the natural world. Why have we lost that holistic approach to science? It turns out there are two very good reasons. First, we know an awful lot more than Voltaire did, which requires tremendous specialization discussed above. But second, science and philosophy are no longer the monopoly of rich European men with leisure time. It’s easy to do a bit of everything when you have very few peers and no obligation to complete any specific task. Scientists now have jobs that give them specific roles, working together as a part of a much wider task, in the case of CERN a literally global project. I might dabble in philosophy as an individual, but I recognize that my expertise is limited, and I really enjoy collaborating with my colleagues to cover together all the details we need to learn about the universe.

In Self’s world, physicists should be able to explain their work to writers, artists, and philosophers, and I agree: we should be able to explain it to everyone. But he — or at least, the character he plays in his own story — goes further, implying that scientific work whose goals and methods have not been explained well, or that cannot be recast in aesthetic and moral terms, is intrinsically suspect and potentially valueless. This is a false dichotomy: it’s perfectly possible, even likely, to have important research that is often explained poorly! Ultimately, Self Orbits CERN asks the right questions, but it is too busy musing about what the answers should be to pay attention to what they really are.

For all that, I recommend listening to the five 15-minute episodes. The music is lovely, the story engaging, and the description of the French countryside invigorating. The jokes were great, according to Miranda Sawyer (and you should probably trust her sense of humour rather than the woefully miscalibrated sense of humor that I brought from America). If you agree with me that Self has gone wrong in how he asks questions about science and which answers he expects, well, perhaps you will find some answers or new ideas for yourself.

It seems some disagreements are interminable: the Anabaptists versus the Calvinists, capitalism versus communism, the Hatfields versus the McCoys, or string theorists versus their detractors. It is the latter I will discuss here, although the former may be more interesting. This essay is motivated[1] by a comment in the December 16, 2014 issue of Nature by George Ellis and Joe Silk. The comment takes issue with attempts by some string theorists and cosmologists to redefine the scientific method by eliminating the need for experimental testing and relying on elegance or similar criteria instead. I have a lot of sympathy with Ellis and Silk’s point of view, but I believe that it is up to scientists to define what science is and that hoping for deliverance by outside people, like philosophers, is doomed to failure.

To understand what science is and what it is not, we need a well-defined model for how science behaves. Providing that well-defined model is the motivation behind each of my essays. The scientific method is quite simple: build models of how the universe works based on observation and simplicity, then test them by comparing their predictions against new observations. Simplicity is needed since observations underdetermine the models (see, for example, Willard Quine’s (1908 – 2000) essay Two Dogmas of Empiricism). Note also that what we do is build models: the standard model of particle physics, the nuclear shell model, string theory, etc. Quine refers to the internals of the models as myths and fictions. Henri Poincaré (1854 – 1912) talks of conventions, and Hans Vaihinger (1852 – 1933) of the philosophy of “as if”, otherwise known as fictionalism. Thus it is important to remember that our models, even the so-called theory of everything, are only models and not reality.

It is the feedback loop of observation, model building, and testing against new observation that defines science and gives it its successes. Let me repeat: the feedback loop is essential. To see why, consider the example of astrology and why scientists reject it. Its practitioners consider it to be the very essence of elegance. Astrology uses careful measurements of current planetary locations and mathematics to predict their future locations, but it is based on an epistemology that places more reliance on the elegance of ancient wisdom than on observation. Hence there is no attempt to test astrological predictions against observations; that would go against their fundamental principles of elegance and the superiority of received knowledge to observation. Just as well, since astrological predictions routinely fail. Astrology’s failures provide a warning to those who wish to replace prediction and simplicity with other criteria. The testing of predictions against observation and simplicity are hard taskmasters, and it would be nice to escape their tyranny, but that path is fraught with danger, as astrology illustrates. The feedback loop from science has even been picked up by the business management community and built into the very structure of management standards (see ISO Annex SL, for example). It would be a shame if management became more scientific than physics.
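The loop described above – build models from observations, prefer the simplest, then test against new observations – can be caricatured in a few lines of code; the candidate models and every name here are purely illustrative.

```python
# Caricature of the scientific feedback loop: among candidate models that
# fit existing observations, prefer the simplest; then test on new data.
def fit_error(model, data):
    """Sum of squared residuals of a model over (x, y) observations."""
    return sum((model(x) - y) ** 2 for x, y in data)

def select_model(candidates, data, tolerance=1e-6):
    """Pick the simplest candidate that fits the data.
    Candidates are (n_parameters, model_function) pairs; simplicity is
    crudely measured by the parameter count."""
    fitting = [(n, m) for n, m in candidates if fit_error(m, data) < tolerance]
    return min(fitting, key=lambda nm: nm[0])[1] if fitting else None

observations = [(0, 0), (1, 1), (2, 2)]
candidates = [
    (1, lambda x: x),              # simple: y = x
    (3, lambda x: x + 0 * x**2),   # same predictions, more parameters
    (1, lambda x: x**2),           # simple, but fails the data
]
chosen = select_model(candidates, observations)

# A new observation tests the chosen model; a failure would send us back
# to model building -- that is the feedback loop.
assert fit_error(chosen, [(3, 3)]) < 1e-6
```

Astrology, in these terms, skips the final assertion: no prediction is ever allowed to send it back to model building.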

But back to string theory. Gravity has always been a tough nut to crack. Isaac Newton (1643 – 1727) proposed the decidedly inelegant idea of instantaneous action at a distance, and it served well until 1905 and the development of the special theory of relativity. Newton’s theory of gravity and special relativity are inconsistent, since the latter rules out instantaneous action at a distance. In 1916, Albert Einstein (1879 – 1955), with an honorable mention to David Hilbert (1862 – 1943), proposed the general theory of relativity to solve the problem. In 1919, the general theory’s prediction for the bending of light by the sun was confirmed by an observation by Arthur Eddington (1882 – 1944). Notice the progression: conflict between two models, proposed solution, confirmed prediction, and then acceptance.

Like special relativity and Newtonian gravity, general relativity and quantum mechanics are incompatible with one another. This has led to attempts to generate a combined theory. Currently string theory is the most popular candidate, but it seems to be stuck at the stage general relativity was in 1917, or maybe even 1915: a complicated (some would say elegant, others messy) mathematical theory unconfirmed by experiment. Although progress is definitely being made, string theory may stay where it is for a long time. The problem is that the natural scale of quantum gravity is the Planck mass, and this scale is beyond what we can explore directly by experiment. However, there is one place quantum gravity may have left observable traces, and that is in its role in the early Universe. There are experimental hints that may indicate a signature in the cosmic microwave background radiation, but we must await further experimental results. In the meantime, we must accept that current theories of quantum gravity are doubly uncertain. Uncertain, in the first instance, because, like all scientific models, they may be rendered obsolete by a new understanding, and uncertain, in the second instance, because they have not been experimentally verified through testable predictions.

Let’s now turn to the question of multiverses. This is an even worse dog’s breakfast than quantum gravity. The underlying problem is the fine-tuning of the fundamental constants needed in order for life as we know it to exist. What is needed for life, as we do not know it, to exist is unknown. There are two popular ideas for why the Universe is fine-tuned. One is that the constants were fine-tuned by an intelligent designer to allow for life as we know it. This explanation has the problem that by itself it can explain anything but predict nothing. An alternative is that there are many possible universes, all existing, and we are simply in the one where we can exist. This explanation has the problem that by itself it can explain anything but predict nothing. It is ironic that to avoid an intelligent designer, a solution based on an equally dubious just-so story is proposed. Since we are into just-so stories, perhaps we can compromise by having the intelligent designer choose one of the multiverses as the one true Universe. This leaves the question of who the one true intelligent designer is. As an old farm boy, I find the idea that Audhumbla, the cow of the Norse creation myth, is the intelligent designer to be the most elegant. Besides, the idea of elegance as a deciding criterion in science has a certain bovine aspect to it. Who decides what constitutes elegance? Everyone thinks their own creation is the most elegant. This is only possible in Lake Wobegon, where all the women are strong, all the men are good-looking, and all the children are above average (A PRAIRIE HOME COMPANION – Garrison Keillor (b. 1942)). Not being in Lake Wobegon, we need objective criteria for what constitutes elegance. Good luck with that one.

Some may think the discussion in the last paragraph is frivolous, and quite by design it is. This is to illustrate the point that once we allow the quest for knowledge to escape from the rigors of the scientific method’s feedback loop, all bets are off and we have no objective reason to rule out astrology or even the very elegant Audhumbla. However, the idea of an intelligent designer or of multiverses can still be saved if it is an essential part of a model with a track record of successful predictions. For example, if that animal I see in my lane is Fenrir, the great gray wolf, and not just a passing coyote, then the odds swing in favor of Audhumbla as the intelligent designer and Ragnarok is not far off. More likely, evidence will eventually be found in the cosmic microwave background, or elsewhere, for some variant of quantum gravity. Until then, patience (on both sides) is a virtue.

Though the mills of science grind slowly;
Yet they grind exceeding small;
Though with patience they stand waiting,
With exactness grind they all.[2]

[1] I have already broken my new year’s resolution to post no more philosophy of science blogs but this is the last, I promise.

[2] With apologies to Henry Wadsworth Longfellow (1807 – 1882)


Not all philosophy is useless.

Friday, December 5th, 2014

In this, the epilogue to my philosophic musings, I locate my view of the scientific method within the landscape of various philosophical traditions and also tie it into my current interest, project management. As strange as it may seem, this triumvirate of the scientific method, philosophy and management meets in the philosophic tradition known as pragmatism, and in the work of W. Edwards Deming (1900 – 1993), a scientist and management guru who was strongly influenced by the pragmatic philosopher C.I. Lewis (1883 – 1964) and who in turn strongly influenced business practices. And I do mean strongly in both cases. The thesis of this essay is that Lewis, the pragmatic philosopher, has had influence in two directions: on business practice and on the philosophy of science. Surprisingly, my views on the scientific method are very much in this pragmatic tradition, and not crackpot.

The pragmatic movement was started by Charles S. Peirce (1839 – 1914) and further developed by William James (1842 – 1910) and John Dewey (1859 – 1952). The basic idea of philosophic pragmatism is given by Peirce in his pragmatic maxim as: “To ascertain the meaning of an intellectual conception one should consider what practical consequences might result from the truth of that conception—and the sum of these consequences constitute the entire meaning of the conception.” Another aspect of the pragmatic approach to philosophic questions was that the scientific method was taken as given, with no need for justification from outside; i.e., the scientific method was used as the definition of knowledge.
How does this differ from the traditional approach to defining knowledge? Traditionally, going back at least to Plato (428/427 or 424/423 BCE – 348/347 BCE), knowledge has been defined as:
1) Knowledge – justified true belief
This leaves open the question of how belief is justified, and since no justification is ever 100% certain, we can never be sure the belief is true. That is a definite problem. No wonder the philosophic community has spent two and a half millennia in fruitless efforts to make sense of it.

A second definition of knowledge predates this and is associated with Protagoras (c. 490 B.C. – c. 420 B.C.) and the sophists:
2) Knowledge – what you can convince people is true
Essentially, the argument is that since we cannot know that a belief is true with 100% certainty, what is important is what we can convince people of. This same basic idea shows up in the work of modern philosophers of science as the idea that scientific belief is basically a social phenomenon and that what is important is what the community convinces itself is true. This was part of Thomas Kuhn’s (1922 – 1996) thesis.

While we cannot know what is true, we can know what is useful. Following the lead of scientists, the pragmatists effectively defined knowledge as:
3) Knowledge – information that helps you predict and modify the future
If we take predicting and modifying the future as the practical consequence of information, this definition of knowledge is consistent with the pragmatic maxim. The standard model of particle physics is not knowledge by the strict application of definition 1), since it is not completely true; however, it is knowledge by definition 3) since it helps us predict and modify the future. The scientific method is built on definition 3). The modify clause is included in the definition because the pragmatists insisted on that aspect of knowledge. For example, C.I. Lewis said that without the ability to act there is no knowledge.

The third definition of knowledge given above does not correspond to what many people think of as knowledge, so Dewey suggested using the term “warranted assertions” rather than knowledge: the validity of the standard model is a warranted assertion. Fortunately, this terminology never caught on. In contrast, James’s pragmatic idea of “truth’s cash value”, derided at the time, has caught on. In a recent book on risk management, “How to Measure Anything,” Douglas W. Hubbard spends a lot of space on what is essentially the cash value of information. In business, that is what is important. The pragmatists were, perhaps, just a bit ahead of their time. Hubbard, whether he knows it or not, is a pragmatist.
Dewey coined the term “instrumentalism” to describe the pragmatic approach. An idea or a belief is like a hand, an instrument for coping. A belief has no more metaphysical status than a fork. When your fork proves inadequate to the task of eating soup, it makes little sense to argue about whether there is something inherent in the nature of forks or something inherent in the nature of soup that accounts for the failure. You just reach for a spoon. However, most pragmatists did not consider themselves to be instrumentalists but rather used the pragmatic definition of knowledge to define what is meant by real.

Now I turn to C.I. Lewis. He is alternately regarded as the last of the classical pragmatists or the first of the neo-pragmatists. He was quite influential in his day as a professor at Harvard from 1920 to his retirement in 1953. In particular, his 1929 book “Mind and the World Order” had a big influence on epistemology and, surprisingly, on ISO management standards. One can see a lot of the ideas developed by Kuhn already present in the work of C.I. Lewis, for example, the role of theory in interpreting observation. Or as Deming, influenced by Lewis, expressed it: “There is no knowledge without theory.” As a theorist, I like that. At the time, this was quite radical. The logical positivists took the opposite tack and tried to eliminate theory from their epistemology. Lewis and Kuhn argued this was impossible. The idea that theory is necessary for knowledge was not new with Lewis but is also present in the works of Henri Poincaré (1854 – 1912), who was duly referenced by Lewis.

Another person Lewis influenced was Willard V. O. Quine (1908 – 2000), although Quine and Lewis did not agree. Quine is perhaps best known outside the realm of pure philosophy for the Duhem-Quine thesis, namely that it is impossible to test a scientific hypothesis in isolation because an empirical test of the hypothesis requires one or more background assumptions. This was the death knell of any naïve interpretation of Sir Karl Popper’s (1902 – 1994) idea that science is based on falsification. But Quine’s main opponents were the logical positivists; Popper was just collateral damage. Quine published a landmark paper in 1951, “Two Dogmas of Empiricism”. I would regard this paper as the high point in the discussion of the scientific method by a philosopher, and it is reasonably readable (unlike Lewis’s “Mind and the World Order”). Besides the Duhem-Quine thesis, the other radical idea is that observation underdetermines scientific models and that simplicity and conservatism are necessary to fill the gap. This idea also goes back to Poincaré and his idea of conventionalism – much of what is regarded as fact is only convention.

To a large extent my ideas match well with the ideas in “Two Dogmas of Empiricism.” Quine summarizes it nicely as: “The totality of our so-called knowledge or beliefs, from the most casual matters of geography and history to the profoundest laws of atomic physics or even of pure mathematics and logic, is a man-made fabric which impinges on experience only along the edges.” and “The edge of the system must be kept squared with experience; the rest, with all its elaborate myths or fictions, has as its objective the simplicity of laws.” Amen.

Unfortunately, after the two dogmas of empiricism were brought to light, the philosophy of science regressed. In a recent discussion of simplicity in science that I came across, there was no mention of Quine’s work, nor of his correct identification of the role of simplicity: to relieve the underdetermination of models by observation. Philosophers found no use for his ideas and have gone back to definition 1) of knowledge. Sad.

Where philosophers dropped the ball, it was picked up by people in, of all places, management. Two people influenced by Lewis were Walter A. Shewhart (1891 – 1967) and Edwards Deming. It is said that Shewhart read Lewis’s book fourteen times and Deming read it nine times. Considering how difficult that book is, it probably required that many readings just to comprehend it. Shewhart is regarded as the father of statistical process control, a key aspect of quality control. He also invented the control chart, a key component of statistical process control. Shewhart’s 1939 book “Statistical Method from the Viewpoint of Quality Control” is a classic in the field, but a large part of it is devoted to showing how his ideas are consistent with Lewis’s epistemology. In this book, Shewhart introduced the Shewhart cycle, which was modified by Deming (and is sometimes called the Deming cycle). Under its current name, Plan-Do-Check-Act (the PDCA cycle), it forms the basis of the ISO management standards.


The original Shewhart cycle as given in Shewhart’s book.

What is this cycle? Here it is as captured from Shewhart’s book. This is the first place where production is seen as part of a cycle, and in the included caption Shewhart explicitly relates it to the scientific method as given by Lewis. Deming added another step to the cycle, the act step, which strikes me as unnecessary; it can easily be incorporated in the specification or plan stage (as it is in Shewhart’s diagram). But Deming was influenced by Lewis, who regarded knowledge without the possibility of acting as impossible; hence the act step. This idea has become ingrained in the ISO management standards as the slogan “continual improvement” (Clause 10 in the standards). To see the extent to which Deming was guided by Lewis’s ideas, just look at Deming’s 1993 book “The New Economics.” He summarizes his approach in what he calls a system of profound knowledge. This has four parts: knowledge of system, knowledge of variation, theory of knowledge and knowledge of psychology. The one that seems out of place is the third; why include theory of knowledge? Deming believed that this was necessary for running a company, and he explicitly refers to Lewis’s 1929 book. Making the reading of Lewis’s book mandatory for business managers would certainly have the desirable effect of cutting down the number of managers. To be fair to Deming, he does suggest starting in about the middle of the book. We have two unbroken chains: 1) Peirce, Lewis, Shewhart, Deming, ISO management standards; and 2) Peirce, Lewis, Quine, my philosophical musings. It reminds one of James Burke’s TV program “Connections”.

Popper may be the person many scientists think of to justify how they work but Quine would probably be better and Quine’s teacher, C.I. Lewis, through Deming, has provided the philosophic foundation for business management. Within the context of definition 3) for knowledge both science and business have been very successful. Your reading of this essay required both. In contradistinction, standard western philosophy based on definition 1) has largely failed; philosophers still do not know how to acquire knowledge. However, not all philosophy is useless, some of it is pragmatic.


This blog is all about particle physics and particle physicists. We can all agree, I suppose, on the notion of the particle physicist, right? There are even plenty of nice pictures up here! But do we know or are we aware of what a particle really is? This fundamental question tantalized me from the very beginning of my studies and before addressing more involved topics I think it is worth spending some time on this concept. Through the years I probably changed my opinion several times, according to the philosophy underlying the topic that I was investigating. Moreover, there’s probably not a single answer to this question.

  1. The Standard Model: from geometry to detectors

The human mind conceived the Standard Model of Particle Physics to give a shape on the blackboard to the basic ingredients of particle physics: it is a field theory with quantization rules, namely a quantum field theory, and its roots go deep down into differential geometry.
But we know that “particles” like the Higgs boson have been discovered through complex detectors, relying on sophisticated electronic systems, tons of Monte Carlo simulations and data analysis. Quite far away from geometry, isn’t it?
So the question is: how do we fill this gap between theory and experiment? What do theoreticians think about and experimentalists see through the detectors? Furthermore, does a particle’s essence change from its creation to its detection?

  2. Essence and representation: the wavefunction

Let’s start with simple objects, like an electron. Can we imagine it as a tiny thing floating here and there? Mmm. Quantum mechanics already taught us that it is something more: it does not rotate around an atomic nucleus like the Earth around the Sun (see, e.g., Bohr’s model). The electron is more like a delocalized “presence” around the nucleus, quantified by its “wavefunction”, a mathematical function that gives the probability of finding the electron at a certain place and time.
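To make this concrete, here is a small sketch of my own (not from any particular textbook’s code) that takes the standard hydrogen 1s wavefunction and integrates its probability density to ask: what is the chance of finding the electron within one Bohr radius of the nucleus?

```python
import math

a0 = 1.0  # Bohr radius, in units where a0 = 1

def radial_density(r):
    """Radial probability density 4*pi*r^2*|psi|^2 for the hydrogen 1s state,
    with psi(r) = exp(-r/a0) / sqrt(pi * a0**3)."""
    psi_sq = math.exp(-2.0 * r / a0) / (math.pi * a0**3)
    return 4.0 * math.pi * r**2 * psi_sq

# Trapezoid-rule integral of the density from r = 0 to r = a0
n = 10_000
h = a0 / n
prob = 0.5 * (radial_density(0.0) + radial_density(a0)) * h
prob += sum(radial_density(i * h) for i in range(1, n)) * h

# The analytic answer is 1 - 5*exp(-2), roughly 0.32: about a one-in-three
# chance of catching the electron inside one Bohr radius.
print(f"P(r < a0) = {prob:.4f}")
```

That number, not a little orbiting ball, is the kind of statement the wavefunction lets us make.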
Let’s think about it: I just wrote that the electron is not a localized entity but it is spread in space and time through its wavefunction. Fine, but I still did not say what an electron is.

I have had long and intensive discussions about this question. In particular, I remember one with my housemate (another theoretical physicist) that nearly ended badly, with us waving frying pans at each other. It’s still not clear to me whether we agreed or not, but we still live together, at least.

Back to the electron: we could agree on considering its essence to be its abstract definition, namely being one of the leptons in the Standard Model. But the impossibility of directly accessing it forces me to identify it with its most faithful representation, namely the wavefunction. I know its essence, but I cannot directly (i.e., with my senses) experience it. My human powers stop at the physical manifestation of its mathematical representation: I cannot go further.
René Magritte represented the difference between the representation of an object and the object itself in a famous painting, “The Treachery of Images”:

[Image: René Magritte, “The Treachery of Images”]

“Ceci n’est pas une pipe”, it says, namely “This is not a pipe”. He is right: the picture is its representation. The pipe is defined as “a device for smoking, consisting of a tube of wood, clay, or other material with a small bowl at one end”, and we can directly experience it. So its representation is not the pipe itself.

As I explained, this is somehow different in the case of the electron or other particles, where experience stops at the representation. So, according to my “humanity”, the electron is its wavefunction. But, to be consistent with what I just claimed: can we directly feel its wavefunction? Yes, we can. For example, we can see its trace in a cloud chamber or in more elaborate detectors. Moreover, electricity and magnetism are (partly) manifestations of electron clouds in matter, and we experience those in everyday life.


You may wonder why I go through all these mental wanderings: just write down your formulas, calculate and be happy with (hopefully!) discoveries.

I do it because philosophy matters. And it is nice. And now that we are a bit more aware of the essence of the things we are investigating, we can move a step forward and start addressing Quantum Chromodynamics (QCD), from its basic foundations to the latest results released by the community. I hope to have sufficiently stimulated your curiosity to follow me during the next steps!

Again, I want to stress that this is my own perspective, and maybe someone else would answer these questions in a different way. For example, what do you think?


Good Management is Science

Friday, October 10th, 2014

Management done properly satisfies Sir Karl Popper’s (1902 – 1994) demarcation criteria for science, i.e. using models that make falsifiable or at least testable predictions. That was brought home to me by a book[1] by Douglas Hubbard on risk management where he advocated observationally constrained (falsifiable or testable) models for risk analysis evaluated through Monte Carlo calculations. Hmm, observationally constrained models and Monte Carlo calculations, sounds like a recipe for science.
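To make the parallel concrete, here is a minimal Monte Carlo cost-risk sketch in the spirit Hubbard advocates. Everything in it, the line items, the distributions and the numbers, is invented for illustration; this is not Hubbard’s actual model:

```python
import random

def simulate_project_cost(n_trials=100_000, seed=1):
    """Toy Monte Carlo cost-risk model: total cost = sum of uncertain
    line items plus a low-probability, high-impact risk event."""
    random.seed(seed)
    costs = []
    for _ in range(n_trials):
        labour = random.gauss(500, 50)               # k$, normal estimate
        hardware = random.triangular(200, 400, 250)  # k$, min/max/mode estimate
        # 10% chance of a schedule slip costing an extra 100-300 k$
        slip = random.uniform(100, 300) if random.random() < 0.10 else 0.0
        costs.append(labour + hardware + slip)
    costs.sort()
    mean = sum(costs) / n_trials
    p90 = costs[int(0.9 * n_trials)]  # 90th-percentile ("contingency") cost
    return mean, p90

mean, p90 = simulate_project_cost()
print(f"mean cost ~{mean:.0f} k$, 90th percentile ~{p90:.0f} k$")
```

The point is that the model’s outputs, a mean and a 90th-percentile cost, are testable predictions: compare them against actuals and the model can be falsified and improved.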

Let us take a step back. The essence of science is modeling how the universe works and checking the assumptions of the model and its predictions against observations. The predictions must be testable. According to Hubbard, the essence of risk management is modeling processes and checking the assumptions of the model and its predictions against observations. The predictions must be testable. What we are seeing here is a common paradigm for knowledge in which modeling and testing against observation play a key role.

The knowledge paradigm is the same in project management. A project plan, with its resource loaded schedules and other paraphernalia, is a model for how the project is expected to proceed. To monitor a project you check the plan (model) against actuals (a fancy euphemism for observations, where observations may or may not correspond to reality). Again, it reduces back to observationally constrained models and testable predictions.

The foundations of science and good management practices are tied even closer together. Consider the PDCA cycle for process management that is present, either implicitly or explicitly, in essentially all the ISO standards related to management. It was originated by Walter Shewhart (1891 – 1967), an American physicist, engineer and statistician, and popularized by Edwards Deming (1900 – 1993), an American engineer, statistician, professor, author, lecturer and management consultant. Engineers are into everything. The actual idea of the cycle is based on the ideas of Francis Bacon (1561 – 1626) but could equally well be based on the work of Roger Bacon[2] (1214 – 1294). Hence, it should probably be called the Double Bacon Cycle (no, that sounds too much like a breakfast food).

But what is this cycle? For science, it is: plan an experiment to test a model, do the experiment, check the model results against the observed results, and act to change the model in response to the new information from the check stage or devise more precise tests if the predictions and observations agree. For process management replace experiment with production process. As a result, you have a model for how the production process should work and doing the process allows you to test the model. The check stage is where you see if the process performed as expected and the act stage allows you to improve the process if the model and actuals do not agree. The key point is the check step. It is necessary if you are to improve the process; otherwise you do not know what is going wrong or, indeed, even if something is going wrong. It is only possible if the plan makes predictions that are falsifiable or at least testable. Popper would be pleased.
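The loop just described can be sketched in code, a toy illustration of my own (the function and class names are made up, not from any standard):

```python
def pdca(model, run_experiment, improve, tolerance, max_cycles=50):
    """Plan-do-check-act as a control loop: refine a model until its
    prediction agrees with observation within a tolerance."""
    for cycle in range(max_cycles):
        prediction = model.predict()          # Plan: make a testable prediction
        observation = run_experiment()        # Do: run the experiment / process
        if abs(prediction - observation) <= tolerance:
            return model, cycle               # Check: model and actuals agree
        model = improve(model, observation)   # Act: revise the model, repeat
    return model, max_cycles

class MeanModel:
    """A trivially simple 'model': predict a single constant."""
    def __init__(self, estimate):
        self.estimate = estimate
    def predict(self):
        return self.estimate

# Toy usage: home in on a true value of 7.0, halving the error each cycle
model, cycles = pdca(
    MeanModel(0.0),
    run_experiment=lambda: 7.0,
    improve=lambda m, obs: MeanModel((m.estimate + obs) / 2),
    tolerance=0.01,
)
print(f"estimate {model.estimate:.4f} after {cycles} cycles")
```

Note that the check step is the exit condition: without it, the loop has no way to know whether anything is going wrong, which is exactly the point made above.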

There is another interesting aspect of the ISO 9001 standard. It is based on the idea of processes. A process is defined as an activity that converts inputs into outputs. Well, that sounds rather vague, but the vagueness is an asset, kind of like degrees of freedom in an effective field theory. Define them as you like, but if you choose them incorrectly you will be sorry. The real advantage of effective field theory and the flexible definition of process is that you can study a system at any scale you like. In effective field theory, you study processes that operate at the scale of the atom, the scale of the nucleus or the scale of the nucleon and tie them together with a few parameters. Similarly with processes: you can study the whole organization as a process or drill down and look at subprocesses at any scale you like; for CERN or TRIUMF that would be down to the last magnet. It would not be useful to go further and study accelerator operations at the nucleon scale. At a given scale, different processes are tied together by their inputs and outputs, and these are also used to tie processes at different scales.

As a theoretical physicist who has gone over to the dark side and into administration, I find it amusing to see the techniques and approaches from science being borrowed for use in administration, even Monte Carlo calculations. The use of similar techniques in science and administration goes back to the same underlying idea: all true knowledge is obtained through observation and its use to build better testable models, whether in science or other walks of life.

[1] The Failure of Risk Management: Why It’s Broken and How to Fix It by Douglas W. Hubbard (Apr 27, 2009)

[2] Roger Bacon described a repeating cycle of observation, hypothesis, and experimentation.


Why pure research?

Thursday, October 2nd, 2014

With my first post on Quantum Diaries I will not address a technical topic; instead, I would like to talk about the act (or art) of “studying” itself. In particular, why do we care about fundamental research, pure knowledge without any practical purpose or immediate application?

A. Flexner in 1939 authored a contribution to Harper’s Magazine (issue 179) named “The Usefulness of Useless Knowledge”. He opens the discussion with an interesting question: “Is it not a curious fact that in a world steeped in irrational hatreds which threaten civilization itself, men and women – old and young – detach themselves wholly or partly from the angry current of daily life to devote themselves to the cultivation of beauty, to the extension of knowledge […] ?”

Nowadays this interrogative is still present, and probably the need for a satisfactory answer is even stronger.

From a pragmatic point of view, we can argue that there are many important applications and spin-offs of theoretical investigations into the deep structure of Nature that did not arise immediately after the scientific discoveries. This is, for example, the case of QED and antimatter: the theories date back to the 1920s and are nowadays exploited in hospitals for imaging purposes (as in PET, positron emission tomography). The most important discoveries affecting our everyday life, from electricity to the energy bound in the atom, came from completely pure and theoretical studies: electricity and magnetism, summarized in Maxwell’s equations, and quantum mechanics are shining examples.

It may seem that it is just a matter of time: “Wait long enough, and something useful will eventually pop out of these abstract studies!” True. But that would not be the most important answer. To me the most important answer is: “Pure research is important because it generates knowledge and education”. It is our own contribution to the understanding of Nature, a short but important step in a marvelous challenge set up by the human mind.

Personally, I find that research into the as yet unknown aspects of Nature responds to some partly conscious and partly unconscious desires. Intellectual achievements provide a genuine ‘spiritual’ satisfaction, peculiar to the art of studying. For the sake of truth, I must say that there are also a lot of dark sides: frustration, stress, graduate-depression effects, geographical and economic instability and so on. But leaving all these troubles aside for a while, I think I am pretty lucky to be doing this job.


Books, the source of my knowledge

During difficult times from the economic point of view, it is legitimate to ask also “Why spend a lot of money on expensive experiments like the Large Hadron Collider?” or “Why fund abstract research in labs and universities instead of investing in more socially useful studies?”

We could answer by stressing again the fact that many of the best innovations came from the fuzziest studies. But in my mind the ultimate answer, once and for all, lies in the power of generating culture, and education through its diffusion. Everything occurs within our possibilities and limitations. A willingness to learn, a passion for teaching, blackboards, books and (super)computers: these are our tools.

Citing again Flexner’s paper: “The mere fact that spiritual and intellectual freedoms bring satisfaction to an individual soul bent upon its own purification and elevation is all the justification that they need. […] A poem, a symphony, a painting, a mathematical truth, a new scientific fact, all bear in themselves all the justification that universities, colleges and institutes of research need or require.”

Last but not least, it is remarkable to think about how many people from different parts of the world may have met and collaborated while questing together after knowledge. This may seem a drop in the ocean, but research daily contributes to generating a culture of peace and cooperation among people with different cultural backgrounds. And that is surely one of the most important practical spin-offs.


Isaac Asimov (1920 – 1992) “expressed a certain gladness at living in a century in which we finally got the basis of the universe straight”. Albert Einstein (1879 – 1955) claimed: “The most incomprehensible thing about the world is that it is comprehensible”. Indeed, there is general consensus in science that not only is the universe comprehensible but that it is mostly well described by our current models. However, Daniel Kahneman counters: “Our comforting conviction that the world makes sense rests on a secure foundation: our almost unlimited ability to ignore our ignorance”.

Well, that puts a rather different perspective on Asimov’s and Einstein’s claims.  So who is this person that is raining on our parade? Kahneman is a psychologist who won the 2002 Nobel Prize in economics for his development of prospect theory. A century ago everyone quoted Sigmund Freud (1856 – 1939) to show how modern they were. Today, Kahneman seems to have assumed that role.[1]

Kahneman’s Nobel Prize-winning prospect theory, developed with Amos Tversky (1937 – 1996), replaced expected utility theory. The latter assumed that people make economic choices based on the expected utility of the results, that is, that they behave rationally. In contrast, Kahneman and company have shown that people are irrational in well-defined and predictable ways. For example, it is understood that the phrasing of a question can (irrationally) change how people answer, even if the meaning of the question is the same.

Kahneman’s book, Thinking, Fast and Slow, really should be required reading for everyone. It explains a lot of what goes on (gives the illusion of comprehension?) and provides practical tips for thinking rationally. For example, when I was on a visit to China, the merchants would hand me a calculator to type in what I would pay for a given item. Their response to the number I typed in was always the same: You’re joking, right? Kahneman would explain that they were trying to remove the anchor set by the first number entered in the calculator. Anchoring is a common aspect of how we think.

Since, as Kahneman argues, we are inherently irrational, one has to wonder about the general validity of the philosophic approach to knowledge, an approach based largely on rational argument. Science overcomes our inherent irrationality by constraining our rational arguments with frequent, independently repeated observations. Much as with project management: we tend to be irrationally overconfident in our ability to estimate resource requirements. Estimates of project resource requirements not constrained by real-world observations lead to projects being over budget and delivered past deadline. Even Kahneman was not immune to this trap of being overly optimistic.

Kahneman’s cynicism has been echoed by others. For example, H.L. Mencken (1880 –1956) said:  “The most common of all follies is to believe passionately in the palpably not true. It is the chief occupation of mankind”. Are the cynics correct? Is our belief that the universe is comprehensible, and indeed mostly understood, a mirage based on our unlimited ability to ignore our ignorance? A brief look at history would tend to support that claim.  Surely the Buddha, after having achieved enlightenment, would have expressed relief and contentment for living in a century in which we finally got the basis of the universe straight. Saint Paul, in his letters, echoes the same claim that the universe is finally understood. René Descartes, with the method laid out in the Discourse on the Method and Principles of Philosophy, would have made the same claim.  And so it goes, almost everyone down through history believes that he/she comprehends how the universe works. I wonder if the cow in the barn has the same illusion. Unfortunately, each has a different understanding of what it means to comprehend how the universe works, so it is not even possible to compare the relative validity of the different claims. The unconscious mind fits all it knows into a coherent framework that gives the illusion of comprehension in terms of what it considers important. In doing so, it assumes that what you see is all there is.  Kahneman refers to this as WYSIATI (What You See Is All There Is).

To a large extent, the understandability of the universe is a mirage based on WYSIATI—our ignorance of our ignorance. We understand as much as we are aware of and capable of understanding, blissfully ignoring the rest. We do not know how quantum gravity works, whether there is intelligent life elsewhere in the universe[2], or, for that matter, what the weather will be like next week. While our scientific models correctly describe much about the universe, they are, in the end, only models, and they leave much beyond their scope, including the ultimate nature of reality.

To receive a notice of future posts follow me on Twitter: @musquod.

[1] Let’s hope time is kinder to Kahneman than it was to Freud.

[2] Given our response to global warming, one can debate if there is intelligent life on earth.
