
Archive for October, 2013

A Nobel Prize most appreciated at CERN

Tuesday, October 8th, 2013

The whole of CERN was elated today to learn that the Nobel Prize for Physics had been awarded this year to Professors François Englert and Peter Higgs for their theoretical work on what is now known as the Brout-Englert-Higgs mechanism. This mechanism explains how all elementary particles get their masses.


CERN had good reason to celebrate, since last year on 4 July, scientists working on LHC experiments proudly announced the discovery of a new particle, which was later confirmed to be a Higgs boson. This particle proves that the theory Robert Brout, François Englert and Peter Higgs developed, along with others, in 1964 was indeed correct.

The Higgs boson discovery was essential to establish their theory, so we are all happy to see their work (and to some extent, our work) acknowledged with this prestigious award.

It took a few more years before Steven Weinberg, a co-recipient of the 1979 Nobel Prize, saw the full implication of their work when he unified two fundamental forces, the electromagnetic and weak forces. Peter Higgs recounted this history in July at the European Physical Society particle physics conference, where he gave a highly appreciated presentation. There he detailed the work of all those who preceded him, including Englert and Brout, in bringing the key elements that enabled him to conceive his own work.

Peter Higgs recalled how it all began with pioneering work on “spontaneous symmetry breaking” done by Yoichiro Nambu in 1960 (for which he shared the Nobel Prize in 2008). Nambu himself was inspired by Robert Schrieffer, a condensed matter physicist who had developed similar concepts for the theory of superconductivity with John Bardeen and Leon Cooper (1972 Nobel Prize).

Spontaneous symmetry breaking is central in the Brout-Englert-Higgs mechanism rewarded today by the Swedish Academy of Science.

Jeffrey Goldstone then introduced a scalar field model often referred to as the “Mexican hat” potential while another condensed matter theorist, Philip Anderson (Nobel Prize in 1977) showed how to circumvent some problems pointed out by Goldstone.

Then, Englert and Brout published their paper, where the mechanism was finally laid out. Peter Higgs, who was working entirely independently from Brout and Englert, had his own paper out a month later with a specific mention of an associated boson. Tom Kibble, Gerald Guralnik and Carl Hagen soon after contributed additional key elements to complete this theory.

“I had to mention this boson specifically because my paper was first rejected for lack of concrete predictions”, Peter Higgs explained good-heartedly in his address last summer. This explicit mention of a boson is partly why his name got associated with the now famous boson.

The history of the Brout-Englert-Higgs mechanism goes to show that in theoretical physics, just as in experimental physics, ground-breaking discoveries take lots of people contributing good ideas, a bit of luck and, above all, great collaboration.

The thousands of physicists, engineers and technicians who made the discovery of the Higgs boson possible at the LHC are also all celebrating today.

Pauline Gagnon

To find out more about the Higgs boson, here is a 25-minute recorded lecture I gave at CERN during the Open Days.

To be alerted of new postings, follow me on Twitter: @GagnonPauline
or sign up on this mailing list to receive an e-mail notification.




The Patient Laureates

Tuesday, October 8th, 2013
Professors Englert (left) and Higgs – 2013 Physics Nobel Laureates.

This morning Professors Peter Higgs and Francois Englert were awarded the Nobel Prize for Physics for their predictions, made in 1964, of a mechanism which explains how certain fundamental particles such as quarks and electrons acquire mass. The mechanism is a key constituent of the Standard Model, our best model for explaining the interaction of fundamental particles. The award crowns a pair of remarkable careers and concludes a gloriously romantic story.

Francois Englert spoke directly to the press following the announcement and declared that he was “extraordinarily happy to have the recognition of this extraordinary award”. He intends to congratulate Peter Higgs on the “very important and excellent work” which he completed during the 1960s.

Here are some of the key milestones on the long and remarkable journey of these two Nobel laureates.

A fortunate rejection

In August 1964, Robert Brout and Francois Englert of the Free University of Brussels published a landmark paper detailing the mechanism by which elementary particles such as quarks and electrons acquire mass. At around the same time, Peter Higgs of Edinburgh University submitted two papers on what is now known as the Higgs field to the journal Physics Letters. The second of those papers was rejected – and a good thing it was too. The respected physicist Yoichiro Nambu, who reviewed Higgs’ paper, suggested that he might wish to elaborate on his theory’s physical implications. In response, Higgs added a paragraph stating that an excitation of the Higgs field would yield a new particle. This particle came to be known as the Higgs boson.

Higgs resubmitted the paper to a rival journal, Physical Review Letters, which published it in October 1964.

A rather youthful Peter Higgs in 1954.

Accurate predictions but no cigar

In the mid-1990s the Higgs was back in the public eye. Although it had not yet been observed, the mechanism had enabled the Standard Model to make a number of successful predictions, including the mass of the top quark, discovered at 176 GeV by Fermilab’s Tevatron.

Experiments at Fermilab’s Tevatron and CERN’s Large Electron Positron (LEP) collider had concluded that the Higgs, if it existed, must lie above 117 GeV, but neither machine was sensitive enough to probe these energies. Enter the mammoth Large Hadron Collider (LHC) in 2008. This beastly circular accelerator, with a circumference of 27 km and phenomenally powerful superconducting electromagnets, promised collision energies approaching 14 TeV. The observation of the Higgs boson was considered, surely, imminent – until the LHC blew up after nine days of operation and was closed down for more than a year of repairs.

Patience Professors, patience…

Hello Higgsy

On 4 July 2012 science’s worst kept secret was publicly announced at CERN, Geneva. Spokespersons of the ATLAS and CMS experiments announced that they had observed a ‘Higgs-like particle’ at 126 GeV, and CERN’s Director General, Rolf-Dieter Heuer, declared – “I think we have it”. Peter Higgs sat in the room and shed a tear of joy. He also met a chap called Francois Englert for the first time that day.

And the winner is…



So it took almost 50 years, a $10 billion machine and the input of thousands upon thousands of scientists, engineers and mathematicians, but technology caught up with theory and proved Francois Englert and Peter Higgs right. There may be some grumblings that the observation of the Higgs boson also deserved the recognition of the Nobel Committee, but I think no one would begrudge these two extraordinary men science’s ultimate accolade. It has certainly been a long time coming.

Well done chaps – you thoroughly deserve it!


Since the Higgs boson’s discovery a little over a year ago at CERN, I have been getting a lot of questions from my friends asking me to explain “what this Higgs thing does.” I often tried to use the crowd analogy ascribed to Prof. David Miller to describe the Higgs (or Englert-Brout-Higgs-Guralnik-Hagen-Kibble) mechanism. Interestingly enough, it did not work well for most of my old school friends, the majority of whom happen to pursue careers in engineering. So I thought that perhaps another analogy would be more appropriate. Here it is; please let me know what you think!

Imagine the Higgs field as represented by a quantity of slightly magnetized iron filings, i.e. small pieces of iron that look like powder, spread over a table or other surface to represent the Higgs field that permeates the Universe. Iron filings are common not only as dirt in metal shops; they are often used in school experiments and other science demonstrations to visualize magnetic fields. It is important that they be slightly magnetized, as this represents the self-interaction of the Higgs field. Here they are pictured in a somewhat cartoonish way:


How can Higgs field generate mass? Moreover, how can one field generate different masses for different types of particles? Let us first make an analogue of fermion mass generation. If we take a small magnet and put it in the filings, the magnet would pick up a bunch of filings, right? How much would it pick up? It depends on the “strength” of that magnet. It could be a little:


…or it could be a lot, depending on what kind of magnet we use — or how strong it is:


If we neglect the masses of the magnets themselves, which we assumed are small, the mass of the picked-up mess, magnets included, is determined entirely by the mass of the picked-up filings, which in turn is determined by the interaction strength between the magnets and the filings. This is precisely how fermion mass generation works in the Standard Model!

In the Standard Model the massless fermions are coupled to the Higgs field via so-called Yukawa interactions, whose strength is parametrized by a number, the Yukawa coupling constant. For different fermion types (or flavors) the couplings would be numerically different, ranging from one to one part in a million. As a result of interaction with the Higgs field (NOT the boson!) in the form of its vacuum expectation value, all fermions acquire masses (ok, maybe not all — neutrinos could be different). And those masses would depend on the strength of the interaction of fermions with Higgs field, just like in our example with magnets and iron filings!
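To put rough numbers on the analogy, here is a minimal Python sketch of the fermion-mass relation m_f = y_f · v/√2, with v ≈ 246 GeV; the Yukawa values below are approximate, illustrative numbers of my own choosing, not precision inputs:

```python
import math

# Standard Model relation: a fermion's mass is its Yukawa coupling
# times the Higgs vacuum expectation value, divided by sqrt(2).
HIGGS_VEV_GEV = 246.0  # vacuum expectation value of the Higgs field

def fermion_mass(yukawa: float) -> float:
    """Mass in GeV for a given Yukawa coupling strength."""
    return yukawa * HIGGS_VEV_GEV / math.sqrt(2)

# Approximate couplings, spanning the huge range mentioned above:
couplings = {"electron": 2.9e-6, "muon": 6.1e-4, "top": 0.99}
for name, y in couplings.items():
    print(f"{name}: {fermion_mass(y):.3g} GeV")
```

The stronger the “magnet”, the heavier the lump: the top quark, with a Yukawa coupling near one, comes out around 170 GeV, while the electron, a million times more weakly coupled, sits near 0.0005 GeV.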

Now imagine that we simply kicked the table! No magnets. The filings would clump together to form lumps of filings. Each lump would have a mass, which would only depend on how strongly the filings attract each other (remember that they are slightly magnetized?). If we don’t know how strongly they are magnetized, we cannot tell how massive each lump will be, so we would have to measure their masses.


This gives a good analogy for the fact that the Higgs boson is an excitation of the Higgs field (the fact pointed out by Higgs), and for why we cannot predict its mass from first principles but need a direct observation at the LHC!

Notice that this picture does not (so far) provide a direct analogy for how the gauge bosons (the W and Z bosons) receive their masses. The W and Z are also initially massless because of the gauge (internal) symmetries required by the construction of the Standard Model. We knew their masses from earlier CERN and SLAC experiments; even before those, we knew that the W must be massive from the fact that weak interactions have finite range.

To extend our analogy, let’s clean up the mess, literally! Let’s throw a bucket of water over the table covered with iron filings and see what happens. Streams of water would pick up iron filings and flow off the table. Assuming that the water’s mass is negligible, the total mass of those water streams (a.k.a. dirty water) would be completely determined by the mass of the picked-up iron filings, just as the masses of the W and Z are determined by the Higgs field.
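In the same hedged spirit as the fermion sketch, the W and Z masses follow from the gauge couplings and the same vacuum expectation value rather than from Yukawa couplings. A quick numerical sketch using approximate, illustrative inputs (g ≈ 0.65, sin²θ_W ≈ 0.223):

```python
import math

# Tree-level relations: m_W = g*v/2 and m_Z = m_W/cos(theta_W),
# where theta_W is the weak mixing angle. All inputs are approximate.
v = 246.0             # Higgs vacuum expectation value, GeV
g = 0.65              # SU(2) gauge coupling (approximate)
sin2_theta_w = 0.223  # sin^2 of the weak mixing angle (approximate)

m_w = g * v / 2
m_z = m_w / math.sqrt(1 - sin2_theta_w)

print(f"m_W = {m_w:.1f} GeV, m_Z = {m_z:.1f} GeV")  # near the measured ~80 and ~91 GeV
```

Note the contrast with the fermion case: there is no free coupling per particle here, which is why the W and Z masses were pinned down long before the Higgs boson itself was found.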

This explanation seemed to work better for my engineering friends! What do you think?


Nobel Rivals

Monday, October 7th, 2013

In less than 24 hours the 2013 Nobel Prize for Physics will be announced and there seems to be one word on everyone’s lips… Higgs. The scientific community appears to hope en masse that Peter Higgs and Francois Englert collect a Nobel for their prediction of the Brout-Englert-Higgs mechanism, the mechanism by which elementary particles such as quarks and electrons acquire mass. There is, however, competition. Here are some of the other front-runners:

Iron-based superconductors – Hideo Hosono

Superconducting materials have zero electrical resistance when cooled below a critical temperature – a very useful property, as it allows for the highly efficient transfer of electrical signals. Today’s most efficient superconductors incorporate a layer of superconducting copper oxide which must be cooled to very low temperatures to operate. For example, in the Large Hadron Collider (LHC) the operating temperature of the superconducting electromagnets, which are used to steer the beam of particles around the accelerator, is maintained at around -271°C. The highest temperature at which today’s superconductors can operate is around -140°C, which is still fairly chilly.

A magnet levitating above a high-temperature superconductor, cooled with liquid nitrogen. (Image by Mai-Linh Doan.)

The possibility of an iron-based superconductor, which could in theory have a much higher critical temperature, had previously been dismissed, as it was assumed that the large magnetic moment of iron prevented the emergence of the pairs of electrons known as ‘Cooper pairs’, which are required for superconductivity in conventional superconductors. However, in 2008 Hideo Hosono of the Tokyo Institute of Technology accidentally stumbled across the first iron-based layered superconductor. Iron-based devices may hold the key to the holy grail of superconductivity – room-temperature superconductors – which would certainly make running the LHC a bit cheaper!

Discovery of extrasolar planets – Geoffrey Marcy, Michel Mayor and Didier Queloz 

In 1995 Michel Mayor and Didier Queloz of the University of Geneva announced the discovery of a massive exoplanet in orbit around the star 51 Pegasi, and they used a sneaky technique known as the radial velocity method to find it. The gravitational field of the orbiting planet causes 51 Pegasi to ‘wobble’ in its own small orbit. Using the Doppler effect, Mayor and Queloz were able to measure the variations in the radial velocity of 51 Pegasi – that is, the speed at which the star was moving towards and away from Earth – and so infer the existence of a massive exoplanet. Clever!
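The arithmetic behind the radial velocity method is just the non-relativistic Doppler formula, Δλ/λ ≈ v_r/c. A minimal sketch; the wavelength numbers below are made up for illustration and are not the actual 51 Pegasi measurements:

```python
C = 299_792_458.0  # speed of light in m/s

def radial_velocity(rest_nm: float, observed_nm: float) -> float:
    """Radial velocity in m/s (positive = receding) from a spectral line shift."""
    return C * (observed_nm - rest_nm) / rest_nm

# A shift of roughly one part in ten million on a 656.3 nm hydrogen line
# corresponds to tens of m/s, the scale of a star's planet-induced wobble.
print(radial_velocity(656.3, 656.3001))
```

Tracking how this tiny velocity swings back and forth over days or weeks reveals the orbital period and a lower bound on the mass of the unseen planet.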

Geoffrey Marcy of the University of California has discovered more exoplanets than anybody else, including 70 out of the first 100 identified. Indeed he was the first to verify the existence of Mayor and Queloz’s exoplanet around 51 Pegasi. He has also discovered the first transiting planet around another star, the first extrasolar planet beyond 5 AU, the first Neptune-sized exoplanets, and the first multiple planet system around a star similar to the Sun. He is the quintessential planet-hunter.

An artist’s impression of an exoplanet.

And the winner is…

So tomorrow’s announcement is not a foregone conclusion. Profs Englert and Higgs, and any others hoping to be awarded a Higgs-related Nobel prize, have some stiff competition. Exoplanets have provided some of the buzziest science headlines in recent years, while iron-based superconductors could have hugely significant practical applications, including in particle physics.

However, one senses that it is perhaps the right time for science’s ultimate accolade to be awarded for the prediction and/or observation of the Higgs, which has undoubtedly been one of the most significant discoveries in science in the past century. And with Profs Englert and Higgs in their 80s and Nobels not awarded posthumously (which is why Robert Brout, who collaborated with Englert and died in 2011, cannot be awarded the prize), one suspects that the Nobel committee is somewhat feeling the pressure.

Stay tuned to Quantum Diaries tomorrow where we’ll be blogging live to keep you informed of the latest developments in the run-up to and following the prize announcements. I will also be doing a bit of tweeting @JimmyDocco. And there is already an excellent blog post up in anticipation of the big announcement here.


In a paper published in the journal Nature, the CLOUD experiment at CERN reports on a major advance towards solving a long-standing enigma in climate science: how do aerosol particles form in the atmosphere? It is known that all cloud droplets form on aerosols: tiny solid or liquid particles suspended in the air. However, how these aerosol particles form or “nucleate” from atmospheric trace gases – and which gases are responsible – has remained a mystery.

According to the Intergovernmental Panel on Climate Change (IPCC), aerosol particles and their influence on clouds constitute the biggest uncertainty in assessing human-induced climate change. Understanding how aerosol particles form in the atmosphere is important since in increased concentrations, they cool the planet by reflecting more sunlight and by forming smaller but more numerous cloud droplets. That, in turn, makes clouds more reflective and extends their lifetimes.  These poorly-understood processes currently limit the precision of climate projections for the 21st century.

Thanks to CERN expertise in materials, gas systems and ultra-high vacuum technologies, the CLOUD team was able to build a chamber with unprecedented cleanliness. This enabled them to introduce minute amounts of various atmospheric vapours into an initially “pure” atmosphere under carefully controlled conditions, and start unravelling the mystery.

The researchers made two key discoveries. Firstly, they found that minute concentrations of amines can combine with sulphuric acid to form aerosol particles at rates similar to atmospheric observations. Secondly, using a pion beam from the CERN Proton Synchrotron, they found that cosmic radiation has a negligible influence on the formation rates of these particular aerosol particles.


This detailed plot shows the nucleation rate (i.e. the rate at which aerosol particles form) against sulphuric acid concentration. The small coloured squares in the background show atmospheric observations. The CLOUD measurements (large symbols) were obtained with various vapours in the chamber (curve 1: only sulphuric acid and water; curve 2: with ammonia added; curve 3 to 5: with amines added). The dashed lines and coloured bands show the theoretical expectations for ammonia+sulphuric acid (blue) and amine+sulphuric acid (red/orange) nucleation, based on quantum chemical calculations. Only amines reproduced the nucleation rates observed in the atmosphere, while ammonia was a thousand times too small.

Amines are atmospheric vapours closely related to ammonia, arising from human activities such as animal farming and also from natural sources. Amines are responsible for the familiar odours emanating from the decomposition of organic matter that contains proteins. For example, the smell of rotten fish is due to trimethylamine.

Thanks to their unique ultra-clean chamber, the CLOUD scientists have shown for the first time that the extremely low concentrations of amines typically found in the atmosphere (namely a few parts per trillion by volume or pptv) are sufficient to combine with sulphuric acid to form highly stable aerosol particles at rates similar to those observed in the lower atmosphere, as shown on the figure above.

Jasper Kirkby, spokesperson of the CLOUD experiment, crouched inside the ultra-clean chamber used for these measurements.

The precise laboratory measurements have allowed the team to develop a fundamental understanding of the nucleation process at a molecular level. The scientists can even reproduce their experimental results using quantum chemical calculations of molecular clustering.

This is the first time an experiment has reproduced the formation rates of atmospheric particles with complete measurements of the participating molecules. So the CLOUD results represent a major advance in our understanding of atmospheric nucleation.

Nobody expected that the formation rate of aerosol particles in the lower atmosphere would be so sensitive to amines. A large fraction of amines arise from human activities, but they have not been considered so far by the IPCC in their climate assessments. The CLOUD experiment has therefore revealed an important new mechanism that could contribute to a presently unaccounted cooling effect.

Moreover, a technique called “amine scrubbing” is likely to become the dominant technology to capture the carbon dioxide emitted by fossil-fuel power plants. Hence, amine emissions are expected to increase in the future and will now need to be considered when assessing the impact of human activities on past and future climate.

Pauline Gagnon

To be alerted of new postings, follow me on Twitter: @GagnonPauline
or sign up on this mailing list to receive an e-mail notification.





Simplicity plays a crucial, but frequently overlooked, role in the scientific method (see the posters in my previous post). Considering how complicated science can be, simplicity may seem far from a driving force in science. Is string theory really simple? If scientists need at least six, seven or more years of training past high school, how can we consider science to be anything but antithetical to simplicity?

Good questions, but simple is relative. Consider the standard model of particle physics. First, it is widely agreed what the standard model is. Second, there are many alternatives to the standard model that agree with it where there is experimental data but disagree elsewhere. One can name many[1]: Little Higgs, Technicolor, Grand Unified Models (in many varieties), and Supersymmetric Grand Unified Models (also in many varieties). I have even attended a seminar where the speaker gave a general technique for generating extensions of the standard model that also have a dark matter candidate. So why do we prefer the standard model? It is not elegance. Very few people consider the standard model more elegant than its competitors. Indeed, elegance is one of the main motivations driving the generation of alternate models. The competitors also keep all the phenomenological successes of the standard model. So, to repeat the question, why do we prefer the standard model to its competitors? Simplicity and only simplicity. All the pretenders have additional assumptions or ingredients that are not required by the current experimental data. At some point those extras may become necessary as more data is made available, but not now. Thus we go with the simplest model that describes the data.

This is true across all disciplines and over time. The elliptical orbits of Kepler (1571–1630) were simpler than the epicycles of Ptolemy (c. 90 – c. 168) or the epicyclets of Copernicus (1473–1543). We draw straight lines through the data rather than 29th-order polynomials. If the data has bumps and wiggles, we frequently assume they are experimental error, as in the randomly[2] chosen graph to the left, where the theory lines do not go through all the data points. No one would take me seriously if I fit every single bump and wiggle. Simplicity is more important than religiously fitting each data point.
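The straight-line-versus-29th-order-polynomial point is easy to demonstrate numerically. A small Python sketch with synthetic data (not the graph from my paper): a two-parameter line recovers the underlying trend, while a polynomial with as many parameters as data points dutifully fits every wiggle of the noise.

```python
import numpy as np

rng = np.random.default_rng(0)
x = np.linspace(0.0, 1.0, 12)
y = 2.0 * x + 1.0 + rng.normal(0.0, 0.05, size=x.size)  # a line plus noise

line = np.polyfit(x, y, 1)     # two parameters: slope and intercept
wiggly = np.polyfit(x, y, 11)  # twelve parameters: interpolates the noise

# The simple model predicts sensibly just outside the data range;
# the wiggly one, having memorized the noise, usually does not.
x_new = 1.2
print(np.polyval(line, x_new))    # close to the true value 2*1.2 + 1 = 3.4
print(np.polyval(wiggly, x_new))
```

The high-order fit scores a perfect residual on the points it was given and still tells us nothing trustworthy about the next measurement, which is exactly the sense in which simplicity earns its keep.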

Going from the sublime to the ridiculous, consider Russell’s teapot. Bertrand Russell (1872–1970) argued as follows: If I were to suggest that between the Earth and Mars there is a china teapot revolving about the sun in an elliptical orbit, nobody would be able to disprove my assertion provided I were careful to add that the teapot is too small to be revealed even by our most powerful telescopes. But if I were to go on to say that, since my assertion cannot be disproved, it is intolerable presumption on the part of human reason to doubt it, I should rightly be thought to be talking nonsense. But what feature of the scientific method rules out the orbiting teapot? Or invisible pink unicorns? Or any one of a thousand different mythical beings? Not observation! They fail the simplicity test. Like the various extensions to the standard model, they are discounted because they carry extra assumptions that are not required by the observational data. This is otherwise known as Occam’s razor.

The argument for simplicity is rather straightforward. Models are judged by their ability to describe past observations and make correct predictions for future ones. As a matter of practical consideration, one should drop all features of a model that are not conducive to that end. While the next batch of data may force one to a more complicated model, there is no way to judge in advance which direction the complication will take. Hence all the extensions of the standard model wait in the wings to see which, if any, the next batch of data will prefer – or rule out.

The crucial role of simplicity in choosing one model from among the many solves one of the enduring problems in the philosophy of science. Consider the following quote from Imre Lakatos (1922–1974), a leading philosopher of science from the last century: But, as many skeptics pointed out, rival theories are always indefinitely many and therefore the proving power of experiment vanishes. One cannot learn from experience about the truth of any scientific theory, only at best about its falsehood: confirming instances have no epistemic value whatsoever (emphasis in the original). Note the premise of the argument: rival theories are always indefinitely many. While rival theories may be indefinitely many, one or at most a very few are always chosen by the criterion of simplicity. We have the one standard model of particle physics, not infinitely many, and his argument fails at the first step. Confirming instances, like finding the Higgs boson, do have epistemic value.

[1] This list is time dependent and may be out of date.

[2] Chosen randomly from one of my papers.


Nobel Dreams

Friday, October 4th, 2013

The liveblog

Greeting from Brussels! This is my liveblog of the Nobel Prize Announcement Ceremony, bringing you the facts and the retweets as they happen.

14:14: Press Conference ongoing. “This is a great day for young people.”

13:56: A moving statement from Kibble (source):

I am glad to see that the Swedish Academy has recognized the importance of the mass-generating mechanism for gauge theories and the prediction of the Higgs boson, recently verified at CERN. My two collaborators, Gerald Guralnik and Carl Richard Hagen, and I contributed to that discovery, but our paper was unquestionably the last of the three to be published in Physical Review Letters in 1964 (though we naturally regard our treatment as the most thorough and complete) and it is therefore no surprise that the Swedish Academy felt unable to include us, constrained as they are by a self-imposed rule that the Prize cannot be shared by more than three people. My sincere congratulations go to the two Prize winners, François Englert and Peter Higgs. A sad omission from the list was Englert’s collaborator Robert Brout, now deceased.

13:37: CERN are holding a press conference at 14:00 (CET) link

13:22: Commentary continues at the Nobel Prize page. Currently discussing why the boson was so hard to find. “This particle has been looked for at every accelerator that has existed.”

13:20: As expected, many news pages have already appeared: CMS, ATLAS, ULB, Edinburgh

13:14: I think my twitter account has exploded with tweets. Also, some Belgian news pages are down, probably due to high traffic. Wow!

13:11: Wow, what a great announcement. Too short though!

13:08: Find out more about the physics at Brussels, where the Brout-Englert-Higgs mechanism was born! The IIHE and the Nobel Prize

13:01: Englert is on the phone. Good to hear from him 🙂

12:59: Animation of the boson appearing, cool!

12:57: We just opened the champagne here at ULB!

12:52: Text for the announcement:

“For the theoretical discovery of a mechanism that contributes to our understanding of the origin of mass of subatomic particles, and which recently was confirmed through the discovery of the predicted fundamental particle, by the ATLAS and CMS experiments at CERN’s Large Hadron Collider”

12:48: The award goes to Englert and Higgs!

12:44: One minute to go!

12:39: We all know what the Brout-Englert-Higgs mechanism is and what the boson discovery means, so let’s instead take a look at the other likely awards. The prize could go to the discovery of extrasolar planets. 51 Pegasi b was an extrasolar planet discovered in 1995, orbiting a sun-like star. This discovery could have far-reaching implications. What would happen if we saw spectral lines suggesting the presence of amino acids coming from the planet? (I’m not sure such a phenomenon is even possible, but if it is it would be a very strong indicator of RNA-like life from another planet.) That discovery took place 18 years ago, and the Brout-Englert-Higgs boson was discovered only one year ago. Either discovery would certainly be worthy of the prize.

12:33: A quantum approach to the delay problem:

Someone go observe the academy and make them leave this terrible superposition. (@lievenscheire)

12:32: Another possible reason for the delay:

There’ll be a new hunt for the #Higgs. He’s gone to the Highlands to avoid the fuss if he wins #nobelprize. Maybe reason for delay. (@BBCPallab)

12:31: The Nobel Prize committee are stalling by suggesting we look at previous awards. At least they are trying to keep us amused while we wait 🙂

12:29: Around the world people are patiently waiting. People in the US have been awake since 5:00am. In Marrakech the ATLAS Collaboration looks on. Here at ULB/IIHE the cafeterias seem deserted. (I’m glad there’s a coffee machine on the desk next to mine.) I’m starting to think this is a plot to get some more media attention for what is bound to be a controversial year for physics. There are many good choices of topic this year, and some of the topics even have controversial choices of Laureates.

12:21: Some humorous speculation about the delay:

The Academy only has 3 #sigma evidence of more votes for than against, waiting for more data (@SethZenz)

They can’t get Comic Sans installed on the Academy’s computer (@orzelc)

The committee were mobbed trying to get across a cocktail party. (@AstroKatie)

12:07: The announcement is delayed until 12:45 CET. People are complaining about the background music!

11:58: The announcement is delayed until 12:30 CET.

11:44: According to the Guardian (source) there will be a delay of 30 minutes.

11:42: Just over two minutes to go. This could be a very exciting year for Belgium.

11:33: See the livecast.

Other info

On Tuesday October 8th the recipient(s) of the 2013 Nobel Prize in Physics will be announced. There has already been a lot of speculation about who might be the Nobel Laureates this year, and there is a lot of interest in the likely contenders! Each year Thomson Reuters publishes predictions of who might receive the Nobel Prizes, and this year they have narrowed the scope down to three likely awards in physics:

  • ‣ François Englert and Peter Higgs, for their prediction of the Brout-Englert-Higgs mechanism. (Brout is deceased and the Nobel Prize is not awarded posthumously.)
  • ‣ Hideo Hosono, for his discovery of iron-based superconductors.
  • ‣ Geoffrey Marcy, Michel Mayor, and Didier Queloz, for their discoveries of extrasolar planets.
The 2012 Nobel Prize Award Ceremony (Copyright © Nobel Media AB 2012 Photo: Alexander Mahmoud)

There has also been speculation that either Anderson or Nambu may receive a second Nobel Prize for their work related to spontaneous symmetry breaking.

With so many different predictions and so many opinions it can be hard to keep up with all the latest news and blogs! I know that a lot of people plan to share their views and experiences of the day, so I’ll be keeping a list of bloggers and tweeters that you can follow.

Seth Zenz:

See Seth’s excellent post about the Nobel Prize, Englert and Higgs, and CERN. You can also follow his twitter account: @SethZenz

James Doherty:

See James’s great post about the Nobel Prize. He’s on Twitter too: @JimmyDocco

Guardian liveblog


This article originally appeared in symmetry on Sept. 30, 2013.

Millions around the world, both scientists and non-scientists, use Scientific Linux, an operating system developed for particle physics. Photo: Reidar Hahn

When a handful of developers at Fermilab modified a computer operating system for use in particle physics, they had no idea their creation would eventually be used by millions inside and outside of science.

Today’s version of the system, called Scientific Linux, runs on computers around the world: at top universities, national laboratories and even in low Earth orbit on the International Space Station. An alternative to Windows or Mac, it has attracted the attention of people from a variety of fields. For example, at the University of Wisconsin at Madison, where the majority of the campus grid is running Scientific Linux, students in fields as diverse as statistics, chemical engineering, economics and avian research use the operating system.

Lauren Michael, a research computing facilitator at UW-Madison’s Center for High Throughput Computing, calls Scientific Linux a powerful tool “enabling researchers from all disciplines.”

When Fermilab Lead Scientific Linux Developer Connie Sieh started the development of the first iteration of the system in 1997, though, she was just looking for cheaper hardware.

In the early 1990s, Fermilab scientists used proprietary operating systems from companies like IBM and SGI, Sieh says. But in 1997, as personal computers became more commonplace, Linux and other free operating systems did, too—for everyday people and, especially, scientists.

So when a computing-heavy project came up at Fermilab, Sieh opted to replace the more expensive IBM and SGI hardware and the software that came with those machines. The new software she decided on was a version of Linux distributed by software company RedHat Inc., mostly because it was free and had the option to be installed in batches, which would save a ton of time. At the same time, RedHat’s Linux was simple enough for scientists to install at their desktops on their own. The computing project, running on Linux, was successful, so the laboratory kept using it.

In 1998, Fermilab released a product called FermiLinux, tailored to fit the lab’s needs.

It was possible to modify the operating system only because, in addition to being free, RedHat’s Linux comes with its source code fully included. This would be a little like a car company supplying detailed blueprints of its cars to every customer and its competitors. Open-source software allows customers to customize a product to meet their exact specifications.

“They go above and beyond what they have to do, as far as releasing the source code,” Sieh says.

Fermilab continued to use FermiLinux until 2003, when RedHat announced that it would start charging money for its product. It took only about a week for Fermilab to use the source code from RedHat’s no-longer-free product to get its own, freely accessible version up and running—what would become Scientific Linux.

By early 2004, a collaboration of developers from Fermilab, CERN and a few other labs released Scientific Linux for the entire high-energy physics community to use. That operating system is the same one that millions of scientists and non-scientists use, free of charge, to this day.

Whenever RedHat releases an update, about once every six months, Fermilab purchases it, and the lab’s tiny team of developers—currently, just Fermilab’s Sieh, Pat Riehecky and Bonnie King—work in overdrive to get their version out soon after, adding tools and customizations they think will be useful.

Aside from big users like the national labs, Sieh says, about 140,000 others run Scientific Linux. And, of course, the program is still widely used in the field it was first meant to serve. Its global presence ensures some consistency and unity across many large institutions.

Alec Habig, a physics professor at the University of Minnesota, Duluth, says when his students visit other institutions to do research, “they know what they’re doing already,” having become familiar with the operating system at the university.

“It’s a good tool for the job,” he says. “It helps our students get a leg-up on the research.”

Sarah Witman


Open Days and sore throats

Wednesday, October 2nd, 2013

Many people have a sore throat this week at CERN. Not too surprising given the 70 000 inquisitive visitors we welcomed over the weekend! It was amazing to see so much interest from the public and the enthusiasm of the 2300 volunteers.


Everybody pitched in. From the fire brigade to the experiments, everyone was showing their part of the lab. Accelerator specialists and physicists, administrative and support staff: everybody was proudly wearing their bright orange T-shirt.


Thanks to the ticketing system for the underground visits, the queuing time was reduced compared with the previous event in 2008. It was not easy to accommodate everyone who wanted to see the large detectors sitting 100 meters underground, the elevators’ capacity being the limiting factor. Nevertheless, 20 000 people were taken down in small groups to one of the underground visit points.

 Visitors to the ALICE experiment.

But there was loads of action taking place at the surface too. I was at the ATLAS stand on Saturday morning to answer questions about the lab and the various research activities. I met people who had come from the Czech Republic, Sweden, Lithuania, Poland, Algeria, USA, Scotland, Spain and even Australia just to have the opportunity to explore the world’s largest particle physics laboratory.

 Visitor playing with robotic machine in metrology lab.

The volunteers were also happily grabbing the opportunity to discover areas they had never visited before. This was certainly my case and I zipped up and down the road trying to peek at as many sites as possible between my assignments.


I was particularly impressed by the enthusiasm displayed by the machinists in the huge workshop adjacent to my office. I pass by it every morning but had never had a chance to see the mind-boggling pieces and machinery the team had on display: all sorts of round objects blown out of metal or milled, some of puzzling shape, geometry or size.


There was much entertainment for the young and not-so-young crowd too. The crane operators took them up in tall cherry pickers or let them lift huge weights from a joystick box. Firefighters put on spectacular burning displays or staged underground rescue operations.

The cryogenics department had its popular liquid nitrogen stand, with the superconducting levitating scooter as one of the main attractions.


As I was making my way back to my car shortly before 8 pm on Sunday, I noticed one of my young colleagues still enthusiastically explaining the workings of a dipole magnet. He had started at 8 am the previous day and was still displaying the same passion, although with slightly less voice!

Many lectures given during the weekend will be available shortly.

Pauline Gagnon

To be alerted of new postings, follow me on Twitter: @GagnonPauline
 or sign up on this mailing list to receive an e-mail notification.