
Archive for December, 2011

On the edge of the icepack

Tuesday, December 20th, 2011

Now that we might (maybe, possibly, could be, it could go away, let's be careful about what we say here lest we put a jinx on it…) be seeing hints of a Higgs, it's time for some cautionary tales: a 'discovery' is not the end of the story; it's only the beginning.

When I was a young graduate student, Martinus Veltman gave a talk at a summer school. He had yet to share the Nobel Prize in physics with Gerard 't Hooft for work on the Standard Model. This was 1980. Veltman said, "Right now, theorists are in the driver's seat. In fact, for the next 30 years, with the exception of the details of the masses of some particles, we know what's going to happen. But in 30 years, we absolutely are going to need experimental guidance to make any progress in particle physics whatsoever."

In point of fact, I don’t think anyone recorded Veltman’s words for posterity, but it made a deep impression on me in a number of ways. First and foremost to a rookie experimentalist was the realization that I had to toil in the vineyards for thirty years until something truly of note would arrive. The second thought was “What on earth is he talking about?” We were handed the Higgs boson as an article of faith.

In the current publicity about the Higgs boson, I often worry that we simplify the goals of what we’re doing to the point of trivializing it. Veltman was right: What we call the “Higgs boson” or the “God particle” is really a surrogate for a strange mechanism that bestows mass on all particles through an interaction. We have the most precise theory of anything and, yet, this absolutely crucial piece is missing. We’ve gotten to that point, 30 years later, at which we’ve found almost all of the particles theorists predicted as part of the Standard Model and are now heading into uncharted territory in higher energy ranges.

What we don't often talk about are the odd properties this mysterious particle is thought to have. Oftentimes theorists, like Veltman, feel that the current model is horribly inelegant and must therefore be completely wrong, only a pale approximation of what nature really does to bestow mass, the most elemental of properties a particle can possess.

What makes it so inelegant? In the first place, if it is as described in our simplest model, it would be the only elementary particle we know of that doesn't possess the intrinsic property of 'spin', which seems to be key to the workings of our theory of fundamental forces. For more on the inherently quantum mechanical property of particles known as spin, see this post by Flip Tanedo.

Beyond its spin 0, the Higgs has a very odd property: It gives energy to the vacuum of space. We don’t really know what this means and often just ignore it. Yet, a kind of vacuum energy has been invoked to describe cosmology on very different energy scales from the energies we’re exploring at the LHC. Astrophysicists often talk about something called the “flatness problem”.

If you took a pot of boiling water just off the stove, and dumped ice into it, eventually you would see the ice and the hot water come to some equilibrium temperature. But, in order for this to happen, the ice and hot water have to physically come in contact with each other. When we look around the universe, the temperature of everything is the same to a remarkable degree, as if it was all sitting in the same pot and came to the same temperature. That would all be well and good, but one patch of the universe cannot possibly have been in contact with another part, because they’re separated by such a large distance that light itself cannot connect the two. How could the entire universe be at the same temperature?

The answer is largely thought to lie in a period called ‘inflation’. Initially the very early universe was so dense and compact that temperatures from one part could communicate to another part: everything was sitting in the same ‘pot’. Then, a mysterious vacuum energy appeared that pushed parts of the universe out of contact, but preserved the uniformity of temperature. This happened within a very early phase of the universe when temperatures and energy densities were far hotter than the conditions we’re producing at the LHC. This vacuum energy is about a trillion times larger than what we associate with the Higgs.

Astrophysicists have also invoked a vacuum energy at another, much weaker scale. You may have heard of ‘dark energy’. Our best guess is that, like the vacuum energy of the early universe, this mysterious force that seems to be pushing the universe apart also seems to be a kind of vacuum energy. Yet, in this case the energy of the vacuum is exceedingly weaker than the energies we’re exploring at the LHC. So, there’s a vacuum energy invoked to explain both the very early universe and the very late universe. At the same time, there’s a vacuum energy associated with the Higgs, but it just sits there like an orphan, of no consequence.

To deal with some of these strange properties, theorists have come up with other ideas for how the Higgs might manifest itself:

1.) Supersymmetric Higgs – The energy scale where the three main forces other than gravity (the strong, weak, and electromagnetic) join together is close to the scale associated with cosmic inflation. This is often called the 'Grand Unification scale.' The fact that we see two of the fundamental forces, weak and electromagnetic, joining together at LHC energies presents a conundrum. It is very difficult to reconcile the Grand Unification scale with the LHC scale in a natural way without having some other kinds of matter arise. The constants of the theory would have to line up just perfectly, fine-tuned to a level of precision equivalent to balancing a pencil on its point. With supersymmetry, a number of Higgs-like particles arise.

2.) Composite Higgs – Rather than deal with an inelegant particle with no spin, theorists have speculated that it's actually made of multiple objects, possibly a pair of top quarks, tightly bound together. The opposite spins of the objects bound together in a composite Higgs would cancel out to give it zero spin.

3.) No Higgs – According to some models, the Higgs is not a particle at all, but the result of interactions that create mass. These models are sometimes called 'technicolor'. Although they aren't particularly favored by theorists because they're difficult to calculate, we cannot rule them out.

Experimentalists are checking the data for all of these possibilities.

But what if something like our vanilla Higgs shows up with a high degree of certainty? Are we done? Hardly! Given all the possibilities and the somewhat inelegant nature of the vanilla-Higgs model, the work has just begun. We have to ask questions like: Is there only one? What is its spin? How does it interact with all the other particles? Are there any variations in its interactions from what we expect, and if so, how do they relate to other measurements we do? These are the tough questions, the ones Veltman was alluding to, and my betting odds are that we'll find deviations from our vanilla Higgs, but it won't be easy. It may take a decade or more of data taking at the highest beam intensities and energies before we begin to understand what's really going on.

Science may begin with blinders and theories may run aground, but eventually we do manage to figure out what’s going on.

Here’s a cautionary tale from the 19th century. It illustrates how people can be steered in the direction of one theory, but ultimately can end up with a far more powerful idea.

A German geographer named August Petermann championed a theory of a warm polar sea. Some expeditions to the high Arctic reported seeing vast stretches of ice-free water extending off toward the horizon. An oceanographer named Silas Bent speculated that the warmth of the Gulf Stream waters flowing north, combined with the waters of a similar ocean current, the Kuroshio (black current) flowing off the coast of Japan, would be sufficient to warm the polar ocean to the point that an expedition, if it could make it through some part of the ice pack, could sail directly to the Pole. Petermann was one of the main champions of the idea.

James Gordon Bennett Jr. was the publisher of the New York Herald and tried to boost circulation by underwriting adventurous expeditions. He financed Henry Morton Stanley's search for David Livingstone, garnering a boost in the circulation of the Herald. Hearing of Petermann's theory of the warm polar sea, Bennett set about financing an expedition, purchased a British gunboat, the HMS Pandora, and refitted it. He enlisted the US Navy to find a crew. Rechristened the USS Jeanette, it was captained by Lieutenant Commander George DeLong. Hoping to repeat the publicity of the famous Stanley-Livingstone meeting, Bennett sent the Jeanette north through the Bering Strait in hopes of reaching the famed open Polar Sea. The Jeanette left San Francisco in July 1879 and was last heard from in late August of that year.

After crossing the Bering Strait, the Jeanette was soon frozen fast in the icepack. Trapped there for nearly two years, it slowly drifted northwest from the coast of Siberia and was ultimately crushed by the icepack. DeLong ordered his crew to abandon ship and began a trek over the frozen icepack, hauling three lifeboats in hopes of eventually reaching settlements along the delta of the Lena River in Siberia. DeLong didn’t make it out alive, perishing in the maze of channels. Some survivors did make it to settlements and eventually made it back home.

Three years later, wreckage of the Jeanette washed up on the coast of Greenland, some three thousand miles away. This prompted many to wonder how the wreckage could have traveled so far across the frozen icecap. Theories about ocean currents proliferated. One adventurer, Fridtjof Nansen, constructed a polar exploration vessel, the Fram. The Fram had a rounded hull that allowed it to be frozen into the icepack without being crushed. Nansen and crew sailed to roughly the point where the Jeanette had been frozen in and commenced a drift across the Polar Sea. At this point, the theory of the Open Polar Sea was completely abandoned in the face of overwhelming data to the contrary.

Although Nansen never reached the North Pole, during the Fram’s expedition, the remaining crew made detailed observations of wind patterns, drift, the ocean depths and temperatures. On its return to Norway, the Fram had a wealth of data that took years to sift through. Vagn Ekman was a student in physics at the University of Uppsala, Sweden. He was studying fluid dynamics and heard of the data from the Fram. After exploring the mathematics of the interactions of air and water flow on the surface of the rotating earth, he developed the modern theory of surface ocean currents, which bears his name: Ekman transport.

Ekman’s work remains one of the fundamental underpinnings of oceanography.

What I'm trying to point out is this: We are on a voyage of discovery. As Veltman said, experimentalists are now really the ones in the driver's seat. The vanilla Higgs is an easy target to fire at, as there are quite specific predictions for how it will be manifested, but there are good reasons to be suspicious that the Higgs is precisely as described in the simplest version of the Standard Model. Like the long, meandering progression from the theory of the Open Polar Sea to the modern theory of ocean currents, I suspect we'll have many changes and false leads. As it stands now, with the performance of the LHC, we are just beginning to penetrate the icepack, and we don't really know what to expect.

A rendering of the long retreat to the Lena River Delta by the DeLong Expedition


An American in Paris

Tuesday, December 20th, 2011

Parisians wait patiently in the rain on December 17, 2011, two hours before attending the lecture by Saul Perlmutter, winner of the 2011 Nobel Prize in physics.

What could drive Parisians to wait in the cold on Saturday, December 17, rather than hit the Christmas shops? Simply the fear that there would not be enough room to attend the very first lecture by the American Saul Perlmutter after receiving his physics Nobel in Stockholm. In 2011, Saul has just seen his work rewarded with the highest distinction, which he shares with Adam Riess and Brian P. Schmidt "for the discovery of the accelerated expansion of the Universe".

If Saul made a stop in Paris before catching a plane home, it was no accident. Researchers from IN2P3 at the Laboratoire de Physique Nucléaire et de Hautes Energies (LPNHE) have been working alongside him for many years, engaged in the programs of cosmological measurements using Type Ia supernovae that led to this discovery.

It was therefore only natural that he accepted the invitation of his colleague Reynald Pain, current director of the LPNHE and co-signatory of the Nobel-winning paper, agreeing to give both a scientific seminar on Friday, December 16, at the Université Pierre et Marie Curie, and a public lecture at the Amphithéâtre des Cordeliers on Saturday the 17th, in the heart of the Latin Quarter. And the crowd turned out in force at this historic Paris landmark. Some 600 to 700 people packed into the 470-seat hall reserved for the occasion, about a hundred of them seated on the stairs and a few dozen standing or sitting on the floor wherever they could find a bit of space!

The crowd settles in. Thirty minutes before the start of the lecture, the hall is already almost full.

On the menu of this special lecture: the accelerating expansion of the Universe, of course, but also dark energy, the mysterious "repulsive" substance that could explain the acceleration in question. Saul went back over the whole adventure that led to a totally unexpected result… one that remains largely unexplained to this day.

Saul Perlmutter, just before his lecture in Paris, December 17, 2011.

Our physics is often judged too complicated to popularize, so much so that it is sometimes hard to convince people that one can organize a lecture, an exhibition, or any other effort to share the mysteries of nature with a broad audience. If proof were needed that the public is hungry for knowledge, this lecture will at least stand as evidence that the hard sciences, too, can draw a very mixed audience of all ages. That audience did not go Christmas shopping on Saturday, December 17, because its gift was meeting Saul Perlmutter. Our thanks to him for this fine Christmas present to our fellow citizens.

Arnaud Marsollier,
Head of Communications, IN2P3

Many thanks to JP. Martin for his photos. His account of the lecture can be read on the site: planetastronomy


Visiting Vietnam

Monday, December 19th, 2011


I am in the Vietnamese town of Qui Nhon, a day and a half's journey from Japan, for the Elastic and Diffractive Scattering conference. The subject is a bit far from my own research, but non-perturbative QCD is my field, which is why the organizers invited me, and so here I sit at the edge of things. More than half the talks are experimental, so I don't follow the basic terminology, and frankly I'm not trying very hard to; I'm indulging in the luxury of spending my time leisurely, attending only the talks that interest me. There are theory sessions on the schedule, and those are fun. Tomorrow I chair a session, and the day after I give my own talk.


At the local airport nearest Qui Nhon, a welcome party from the hotel was waiting. It was the first time I had ever had a flower garland placed around my neck. We boarded a bus for the hotel, and the villages we passed along the way struck me as startlingly premodern. Through dim villages that seemed barely to have electricity, locals wearing conical hats like the ones in the old kasa-jizō folk tale rode motorbikes and bicycles loaded with goods piled higher than their own heads. Tin roofs, muddy roads full of puddles: it was the kind of village scene I had only seen in American movies about the Vietnam War. The perhaps obvious fact that most of Asia is actually like this unfolded before my eyes, and I was left speechless. The spreading rice paddies and the mountains beyond were a familiar sight, yet that only made the differences from the rural Japan I know stand out all the more.






The Week of Higgs

Sunday, December 18th, 2011

It has been an exciting week at CERN. Of course rumors had been flying for a while already. And even though most people in the theory division are not directly related to the experiments, people know people and information was passed around.

Certainly, I wasn't going to miss the announcement of the results. To be sure of a seat, I positioned myself strategically in the auditorium at 10 am, four hours before the seminar, armed with food, drink, and my iPad. By then, the auditorium was already half full. And I wasn't the only one from the theory group.

As time passed, more and more people were pouring in; many who were part of either ATLAS or CMS had come in from their home institutions especially for this occasion. Security was stepped up and only people carrying their access badges were let in. By noon, the room had already gone past full capacity. People were sitting on the floor everywhere and the security staff did not let anyone else enter. Outside the doors, a disappointed crowd was forming. People who thought arriving two hours in advance was enough were in for a surprise.

Even though it was hard to concentrate on work waiting in the lecture hall, it was fun being part of this excited buzzing crowd, all waiting eagerly for the announcement. When at the beginning of the seminar, Rolf Heuer thanked the whole LHC team for their great work, the crowd broke into a huge round of roaring applause.

There's no need to reiterate the results here. There's not enough data to call it a discovery yet, but we're all confident it's just a matter of time until the discovery can be announced. I'm not sure whether, on that day, there will be as much excitement in the air as on the day of this week's first announcement. It is a day worth remembering and I am happy to have been part of it, to have been right here at CERN. And I am looking forward to next year, to seeing events unfold further at the LHC.


The development of science is often portrayed as a conflict between science and religion, between the natural and the supernatural. But it was equally, if not more so, a conflict with Aristotelian concepts: a change from Aristotle's emphasis on why to a dominant role for how. To become the mainstream, science had to overcome resistance, first and foremost, from the academic establishment and only secondarily from the church. The former, represented by the disciples of Aristotle and the scholastic tradition, was at least as vociferous in condemning Galileo as the latter. Galileo, starting from when he was a student and for most of his career, was in conflict with the natural philosophers. (I decline to call them scientists.) His conflict with the church came mostly toward the end of his career, after he was fifty, and more seriously when he was nearing seventy. The church itself even relied on the opinions of the natural philosophers to justify condemning the idea that the earth moved. In the end, science and Galileo's successors won out and Aristotle's natural philosophy was vanquished: the stationary earth, the perfect heavens (circular planetary orbits and perfectly spherical planets), nature abhorring a vacuum, the prime mover and so on. For most of these it is so long and good riddance. So why do philosophers still spend so much time studying Aristotle? I really don't know.

However, Aristotle did have a few good ideas whose loss is unfortunate. The baby was thrown out with the bath water, so to speak. One such concept, although much abused, is Aristotle's classification of causes. The four types of causes he identified are the formal, material, effective and final causes. He believed these four causes were necessary and sufficient to explain any phenomenon. The formal cause is the plan, the material cause is what a thing is made of, the effective cause is the "how", and the final cause is the "why". If you think in terms of building a house, the formal cause is the blueprint, the material cause is what it is built of (the wood, brick, glass, etc.), the effective causes are the carpenters and their tools (are hammers obsolete?) and the final cause is the purpose the house was built for.

Aristotle and his medieval followers emphasized the final cause and pure thought. Science became established only by breaking away from the final cause and the tyranny of “why”.  The shift from concentrating on pure thought and the final cause (why) to concentrating on observations and effective causes (how) was the driving factor in the development of science.  Science has now so completely swept Aristotle aside that, at the present time, only the effective cause is considered a cause in the “cause and effect” sense.

However, in dealing with human activities all four of these types of causes are useful. For example, consider TRIUMF, where I work. The formal cause is the five-year plan given in a brilliantly written (OK, I helped write it and they pay my salary, so what else could I say?) 800-page book that lays out the program for the current five years and beyond. The material cause is what TRIUMF is built of (many tons of concrete shielding, among other things). The effective cause is the people and machines that make TRIUMF work. The final cause is TRIUMF's purpose as given in the mission and vision statements. A similar analysis can be done for any organization. The usefulness of the final-cause concept is shown by its resurrection in good management practice under the heading of mission and/or vision statements.

Now, when we go from human activity to animal activity, we lose the formal cause. Consider a bird building a nest. The material cause is what the nest is built of, the effective cause is the bird itself and the final cause is to provide a safe place to raise its young. But the formal cause does not exist. It is doubtful the bird has a blueprint for the nest; rather the nest is built as the result of effective causes – the reflexive actions of the bird. No bird ever wrote an 800-page book outlining how to build a nest. Just as well, or the avian dinosaurs (otherwise known as birds) would have gone extinct along with the non-avian ones.

A similar analysis exists for simpler organisms. A recent study of yeast showed why (in the sense of the final cause) yeast cells clump together: to increase the efficiency of extracting nutrients from the surroundings. Thus in dealing with human, animal or even yeast activities, science can and does answer the why or final cause question. In the case of the yeast the effective cause would be the method the yeast cells used to do the bonding and the material cause the substances used for the bonding.

When we go from animate to inanimate we lose, in addition to the formal cause, the final cause. Aristotle explained the falling of objects in terms of a final cause: the objects wanted to be at their natural place at the center of the universe, which Aristotle thought was the center of the earth. The reason they sped up as they fell was that they became jubilant at approaching their natural place (I am not making that up). Newton, in contrast, proposed an effective cause: gravity. There was no goal, i.e. no final cause, just an effective cause. A river does not flow with the aim of reaching the sea but just goes where gravity pulls. Similarly with evolution by natural selection: it has no aim but just goes where natural selection pulls. This freaks out those people who insist on formal and final causes. With much ingenuity, they have tried to rectify the situation by proposing formal and final causes: intelligent design and theistic evolution, respectively. Intelligent design posits that at least some of the structures found in living organisms are the result of intelligent design by an outside agent and not the result of natural selection, while theistic evolution posits that evolution was controlled by God to produce Homo sapiens. Neither has been found to increase the ability of models to make accurate predictions; hence they have no place in science. It is this lack of utility, not the role of a supernatural agent, that leads to their rejection as science.

To summarize: for the activities of living things, science can and does answer the why question and assigns a final cause. However, for non-living things science has not found the final cause concept to be useful and has eliminated it based on parsimony. Aristotle, his followers and disciples made the mistake of anthropomorphizing nature and assigning to it causes that are only appropriate to humans or, at best, living things.

Additional posts in this series will appear most Friday afternoons at 3:30 pm Vancouver time. To receive a reminder follow me on Twitter: @musquod


The special seminar on the latest results in the search for the Higgs boson, held yesterday at CERN, was surely the most exciting presentation of my career. The atmosphere was electric, and the auditorium was already packed to bursting more than two hours before the scheduled start.

The members of each collaboration, ATLAS and CMS, knew their own share of the results in advance, but without having seen the key details from the other collaboration. The two teams had worked independently, without sharing their results. So everyone was waiting impatiently to see whether the other group was also getting similar, concordant results.

But physicists are well known for their caution, and rightly so. Before claiming a discovery, we demand that, in the absence of a Higgs boson, the probability of observing a given excess of events be less than 0.00003%, or five sigma.
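
The five-sigma criterion is just a statement about the tail of a standard normal distribution, so the quoted 0.00003% can be checked in a couple of lines. A minimal sketch (generic statistics, not code from either experiment):

```python
from math import erfc, sqrt

def one_sided_p(n_sigma: float) -> float:
    """One-sided tail probability of a standard normal beyond n_sigma."""
    return 0.5 * erfc(n_sigma / sqrt(2.0))

# Five sigma corresponds to roughly 2.9e-7, i.e. about 0.00003%.
print(f"5 sigma -> p = {one_sided_p(5.0):.2e}")
# A 2-3 sigma excess, by contrast, happens by chance about 1% of the time.
print(f"2.3 sigma -> p = {one_sided_p(2.3):.4f}")
```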

As for the Higgs boson, even if we glimpse promising signs, we want it not only to look like a Higgs but also to sing, dance, smell, and behave as only a Higgs can. At this point it merely "looks like" a Higgs with a mass around 124-126 GeV. But the confidence level is far too low for either experiment to commit itself, with signals of only two to three standard deviations, well short of the five sigma that leave no ambiguity.

The higher the number of standard deviations, the more incompatible the results are with the hypothesis of background only and no Higgs.

Of course, the fact that the two collaborations see similar small excesses, not just in one but in several different channels, strengthens the possibility that we are observing the first manifestations of the Higgs boson. As one of my colleagues explained, each decay channel is a bit like one way of making change for a bill, the Higgs being the bill. The fact that all these decay channels give the same mass suggests they all come from the same particle.

The ATLAS spokesperson, Fabiola Gianotti, showed that ATLAS sees small excesses of events in two different decay channels, both around 126 GeV: the Higgs decaying into two photons, or into four leptons (electrons or muons). A third channel, in which the Higgs breaks into two W bosons, each going to a lepton-neutrino pair, is consistent with the first two, though the effect is not very pronounced.

CMS, represented by its spokesperson Guido Tonelli, presented results based on five different channels, adding Higgs decays into heavy quarks or into pairs of taus to the three channels used by ATLAS. Once combined, these excesses are compatible with the presence of a Higgs, with the most probable value at 124 GeV, but the amount of data available is not enough to settle the question. The observed excess could be due to a statistical fluctuation of the known backgrounds, with or without a Higgs in this mass range.

The probability of obtaining an excess of events as large as or larger than the one observed if no Higgs exists, before including the corrections associated with the "look-elsewhere effect". As can be seen, the observed excesses coincide for two channels and are compatible with the third. The statistical significance remains modest, but the agreement of these three channels, two of them particularly robust, strengthens the possibility that the Higgs exists. The signal, however, slightly exceeds what would be seen in the presence of a 126 GeV Higgs boson, as indicated by the dashed black curve.

The small excesses of events observed by CMS in five different channels. The dashed line shows what was expected in the absence of a Higgs boson. The green and yellow bands represent the margins of error (at one and two standard deviations). The black curve shows the results obtained. The farther this curve strays outside the yellow band, the stronger the case for finding the Higgs there. The favored value is thus the one at 124 GeV.

Combining all channels, ATLAS obtains a deviation of 2.3 sigma and CMS 1.9 sigma with respect to the background, once we include the probability that such a fluctuation could occur anywhere in the mass range studied (the "look-elsewhere" effect). The probability of obtaining such a fluctuation in the absence of a Higgs boson is about 1% for ATLAS and 2.9% for CMS.

If we neglect to "look elsewhere", the local deviation corresponds to 3.6 sigma for ATLAS. This can be compared with the deviation expected if a Higgs of about 126 GeV exists, namely 2.3 sigma. ATLAS therefore sees slightly more events than expected. That is precisely the nature of statistical fluctuations: they can go in either direction. More data will clarify the situation.
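
The deflation of a strong local excess into a modest global probability can be mimicked with a simple multiple-comparisons estimate. A sketch under the crude assumption of independent mass bins; the effective number of bins used here (60) is purely illustrative, and the experiments compute their trials factor properly:

```python
from math import erfc, sqrt

def p_local(n_sigma: float) -> float:
    """One-sided tail probability for a single mass bin."""
    return 0.5 * erfc(n_sigma / sqrt(2.0))

def p_global(n_sigma: float, n_bins: int) -> float:
    """Chance that at least one of n_bins background-only bins
    fluctuates up by n_sigma or more (independent-bin approximation)."""
    return 1.0 - (1.0 - p_local(n_sigma)) ** n_bins

# A 3.6 sigma local excess, scanned over ~60 effective bins (assumed),
# is no longer rare: the global probability comes out near 1%.
print(p_global(3.6, 60))
```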

Having already combined all the data collected in 2010 and 2011, CMS, as Guido Tonelli showed, excludes all possible Higgs masses except between 114 and 127 GeV at the 95% confidence level. ATLAS excludes masses between 131 and 453 GeV at the same confidence level, and also between 114 and 115.5 GeV.

The set of possible Higgs mass values excluded by CMS. The dashed line represents what was expected, while the black curve shows the observed results. Wherever this curve passes below the red horizontal line, the corresponding mass value is excluded. All Higgs mass values above 127 GeV are thus ruled out.

Of course, everyone would love to declare victory and call the discovery of the Higgs a done deal. But it is still too early, despite these encouraging signs. More data will be collected in 2012 and will give us an unequivocal answer on the existence or non-existence of the Higgs. If the small excesses we see today keep growing, we should soon hear the Higgs singing its little song, at the top of its lungs.

Pauline Gagnon

For more information, visit the CERN website

To be alerted when new blogs appear, follow me on Twitter: @GagnonPauline or add your name to this mailing list to receive an e-mail notification




The special CERN seminar on recent Higgs boson results held yesterday was one of the most exciting presentations I have ever attended. The atmosphere was electrifying, and the room was packed more than two hours before the seminar even started.

Members of each collaboration working on this, namely CMS and ATLAS, knew their own half of the story. But the two teams had worked independently, and the crucial details of the final results were not known outside each collaboration. Everybody wanted to see whether the small excesses observed by their own team coincided with similar findings from the other collaboration.

Physicists are notoriously cautious, for good reason. To claim a discovery, we require that, if there were only background (and no Higgs), the odds of seeing an excess of events as large as the one observed be less than 0.00003%, the 5-sigma level.

In the case of the Higgs boson, if we find signs of its possible presence, we will want it to do much more than just "look like" a Higgs: it must also behave like one, smell like one, and dance and sing like only that particle can. As it stands, the excess looks like a Higgs with a mass somewhere around 124-126 GeV, but the level of confidence is far too low to draw conclusions. Each experiment sees small signals at the 2-3 sigma level, which is what is expected from a Higgs boson given the current data size. Reaching the unambiguous 5-sigma level will require more data.

The higher the number of sigma, the more incompatible the data are with having only background and no Higgs.
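To make the sigma language concrete, here is a minimal sketch of the standard Gaussian tail statistics behind those statements (this is textbook probability, not the collaborations' full likelihood machinery):

```python
import math

def sigma_to_pvalue(n_sigma):
    """One-sided tail probability of a standard Gaussian beyond n_sigma."""
    return 0.5 * math.erfc(n_sigma / math.sqrt(2))

def pvalue_to_sigma(p):
    """Invert the above by bisection: significance for a one-sided p-value."""
    lo, hi = 0.0, 10.0
    for _ in range(100):  # sigma_to_pvalue is monotone decreasing
        mid = 0.5 * (lo + hi)
        if sigma_to_pvalue(mid) > p:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

# The 5-sigma discovery threshold corresponds to roughly 0.00003%:
print(sigma_to_pvalue(5.0))  # ~2.9e-7
```

Running it shows why 5 sigma is such a demanding bar: the chance of background alone fluctuating that far is under three in ten million.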

Of course, it is encouraging that both groups find similar results, not only in one decay mode, but in multiple channels. A decay channel represents one of the many ways the Higgs boson can decay. As one of my colleagues put it, if the Higgs boson were a large coin, each decay channel would represent one way to break this coin into small change. CMS and ATLAS collected all events corresponding to specific decay channels. The fact that they all point to roughly the same mass value is an indication they could all be coming from the same particle.
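As a toy illustration of why consistent excesses in several channels matter (the collaborations actually use a full profile-likelihood combination, not this shortcut), independent channel significances can be combined in the simple uncorrelated-Gaussian approximation known as Stouffer's method:

```python
import math

def combine_sigmas(sigmas):
    """Stouffer's method: combined significance of independent channels,
    assuming equal weights and no correlations (a toy model only)."""
    return sum(sigmas) / math.sqrt(len(sigmas))

# Three channels, each showing a modest ~2-sigma excess at the same mass,
# combine to a stronger overall signal than any single channel alone:
print(combine_sigmas([2.0, 2.0, 2.0]))  # ~3.46
```

The point is qualitative: several modest excesses at the same mass are far more telling than one excess on its own.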

ATLAS spokesperson Fabiola Gianotti presented the ATLAS findings first.

Two separate decay channels both favour a mass value around 126 GeV: the Higgs decaying into two photons, and the Higgs decaying into two Z bosons, each Z going into a pair of electrons or muons. A third channel, the Higgs decaying into two W bosons, each W decaying into an electron or muon plus a neutrino, is also consistent with this hypothesis, though less strongly.

Guido Tonelli, CMS spokesperson, showed the combination of five different channels, adding the Higgs decaying to two taus and to pairs of heavy quarks to those investigated by ATLAS. The combined results are compatible with a Higgs signal, the highest probability being found at 124 GeV, but not enough data were available to draw any definitive conclusions. The observed excess of events could be a statistical fluctuation of the known background processes, either with or without the existence of the Standard Model Higgs boson in this mass range.

The probability of obtaining an upward fluctuation as large as or larger than the one observed if there is only background, before accounting for the look-elsewhere effect. As one can see, the excess falls at the same position in two different search channels and is also compatible with a much smaller excess in the third channel. The statistical significance is still modest, but having three channels, especially two robust ones, is an indication this could be real. The signal is nevertheless somewhat stronger than what is expected from a Standard Model Higgs boson with a mass of 126 GeV, shown by the black dashed curve.

The small excess of events observed by CMS in five different decay channels. The dotted line shows what was expected in the absence of a Higgs boson. The green and yellow bands represent the 1-sigma and 2-sigma error margins on this prediction. The black curve is the observed data. Excursions beyond the yellow band indicate where a Higgs signal is the strongest. The most significant value is found for a Higgs mass around 124 GeV.

When all their channels are combined, ATLAS obtains an excess of 2.3 sigma over background and CMS 1.9 sigma, after taking into account the "look-elsewhere effect", namely how often, when looking at all the mass points under study, some point would fluctuate that much. The chance of obtaining an upward fluctuation this large or larger, if there is only background, is about 1% for ATLAS and 2.9% for CMS.

Without the "look-elsewhere" correction, the ATLAS probability of such an excess of events, if there is only background, corresponds to 3.6 sigma. This value can be compared to the 2.4-sigma deviation one would expect if the excess were due to a Higgs boson. So ATLAS sees slightly more events than what is expected from a Standard Model Higgs boson. Statistical fluctuations can go in both directions, which is why caution is required until more data are analyzed.
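The deflation from a local to a global significance can be sketched with a simple trials-factor approximation (the real analyses use more sophisticated methods, and the number of effectively independent mass points used below is an illustrative assumption, not a published number):

```python
import math

def local_pvalue(n_sigma):
    # One-sided Gaussian tail probability for a local excess of n_sigma
    return 0.5 * math.erfc(n_sigma / math.sqrt(2))

def global_pvalue(p_local, n_trials):
    # Probability that at least one of n_trials independent mass points
    # fluctuates this much: 1 - (1 - p)^N
    return 1.0 - (1.0 - p_local) ** n_trials

# ATLAS quotes 3.6 sigma locally. Assuming of order 60 independent
# mass points, the global probability climbs to the percent level,
# in the ballpark of the quoted ~1% (2.3 sigma global).
p_loc = local_pvalue(3.6)
print(p_loc, global_pvalue(p_loc, 60))
```

This is why the same excess is quoted as both 3.6 sigma and 2.3 sigma: the more places you look, the more likely it is that background alone fluctuates somewhere.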

Having already combined all data from 2010 and 2011 from more channels, CMS showed they now exclude all possible Higgs masses from 127 to 600 GeV at a 95% confidence level, leaving only a narrow window open between 114 and 127 GeV. ATLAS excludes masses from 131 GeV up to 453 GeV at the same confidence level, as well as between 114 and 115.5 GeV.

The exclusion limits presented by the CMS collaboration. The dotted curve shows what was expected, while the black line with dots indicates what is observed. Wherever this curve falls below the red line, the corresponding mass value is excluded. All masses above 127 GeV are now excluded at the 95% confidence level.

Of course, everybody would love to be able to say: that's it! We found it. But despite the encouraging signs, it is still premature. More data will be collected in 2012, and the answer will then become unambiguous: we will either discover the Higgs or rule it out completely. If the small effects presented today keep growing, we will see the Higgs do its little song and dance.

Pauline Gagnon

To be alerted of new postings, follow me on Twitter: @GagnonPauline or sign up on this mailing list to receive an e-mail notification.

For more information, visit the CERN website or the ATLAS and CMS websites










After the talk is before the talk

Wednesday, December 14th, 2011

… and I am not really talking about the Higgs status presentations yesterday at CERN, even though I have to admit that they have us all abuzz with excitement and speculation about the possibilities. And the excitement is spreading to the general public: today, the biggest newspaper in Munich, the "Sueddeutsche Zeitung", has an LHC event display on the front page, above the fold.

Simulated Higgs decay into 2 muons at a 3 TeV CLIC collider, for an assumed Higgs mass of 120 GeV.

Although nothing has really changed (yet), the world feels a bit different today. Not a bad starting point for a colloquium on a possible future project in particle physics, a high-energy linear e+e- collider, which I'll be giving in Prague this afternoon. Higgs physics is high on the list of things to do at such a machine, which promises very precise measurements of the Higgs's properties and of its couplings to other particles, revealing its connection to particle masses. Some of these measurements were recently studied in detail for the CLIC conceptual design report, which shows that, given enough luminosity and running time, even the very rare decay of a Higgs particle into two muons might be measurable, with a statistical precision a bit better than 25%.
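A back-of-envelope way to see where a number like "25% statistical precision" comes from is simple counting statistics: for a counting measurement, the relative uncertainty scales as the square root of the expected events. The function below is a generic illustration of that scaling, not the CLIC study's actual analysis, which involves backgrounds, efficiencies, and fits:

```python
import math

def counting_precision(n_signal, n_background=0.0):
    """Relative statistical precision of a counting measurement,
    delta S / S = sqrt(S + B) / S, in the Gaussian approximation."""
    return math.sqrt(n_signal + n_background) / n_signal

# ~25% precision corresponds to O(16) signal events if background-free:
print(counting_precision(16))  # 0.25
```

So a quoted 25% precision hints at a measurement built on only a handful of H → mu mu events, which is what makes this channel so challenging.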

Let's see how the LHC results develop; they might give us a whole new field to study…


Today’s public seminar at CERN, where the ATLAS and CMS collaborations presented the preliminary results of their searches for the Standard Model (SM) Higgs boson with the full dataset collected during 2011, is a landmark for high-energy physics!

The Higgs boson is a still-hypothetical particle, postulated in the mid-1960s to complete what is now considered the SM of particle interactions. Its role within the SM is to provide other particles with mass: the masses of elementary particles result from their interaction with the Higgs field. The Higgs boson's properties are fully specified in the SM, apart from its mass, which is a free parameter of the theory. (more…)