## Archive for November, 2012

### The mystery still hovers over the Higgs boson

Thursday, November 15th, 2012

Ever since the discovery last July of what might be the Higgs boson, physicists on the CMS and ATLAS experiments have been trying to establish its true identity. Is it really the Higgs boson predicted by the Standard Model of particle physics, or some other type of Higgs boson belonging to a different theory?

To get to the bottom of it, we must check all its properties, for example how and in what proportions it decays. We must also establish its spin and parity, two properties of fundamental particles.

The new boson has a short lifetime: it decays immediately after being created. It can break apart in five different ways observable at the Large Hadron Collider (LHC): producing two photons, two Z or W bosons, two b quarks, or two tau leptons (a particle similar to the electron but 3500 times heavier). We must establish whether each decay mode exists and whether it occurs at the predicted rate.

Last summer, when the discovery was announced, the two experiments had clear results for only the first three modes. The data sample was then too small to see decays into a pair of b quarks or taus.

With more data now in hand, the two experiments were able to show results in all channels at a conference today in Kyoto, as can be seen on the plots below. The left figure shows the CMS results and the right one those of ATLAS.

The values “σ ⁄ σSM” and “μ” are equivalent and represent the ratio between what is observed and what the Standard Model predicts. A value of 1 means everything agrees with the theory, while zero implies that this decay channel is not observed. Any other value implies that the channel is seen but occurs at a rate different from the expected one. Of course, the error margins must be taken into account before drawing any conclusion.

Both experiments now have results for the decay channels into pairs of b quarks or taus, and the error margins have shrunk for several channels. For now, CMS obtains a combined value of 0.88 ± 0.21 while ATLAS measures 1.3 ± 0.3. Both measurements are therefore compatible with 1.

The presence of these five channels would be compatible with a spin-0 boson. If in addition the decay rates match, the new boson would look more and more like the Higgs boson, but that would still not be enough. It must also have positive parity, as the Standard Model predicts.

The spin of a fundamental particle refers to its rotation on itself. Parity relates to what happens when one direction in space is flipped. Do we see the same thing when observing something directly or through a mirror, where left and right are inverted? Particles with positive parity behave the same way whether you look at them directly or in a mirror.

The parity of a particle can be determined by observing the direction taken by its debris when it decays. Depending on its parity, the fragments will preferentially fly off in one direction rather than another. For example, CMS measured the angles between the four electrons or muons produced when a boson first decays into two Z bosons, each of which in turn gives a pair of electrons or muons. They then compared the distributions with two templates: one established for negative parity, the other for positive parity, as shown on the figure below.

The left curve in blue shows the probability one would measure for a particular point for a particle of negative parity, while the right one in pink gives that probability for positive parity. The value measured by CMS, indicated by the green arrow, clearly shows that the new boson most likely has positive parity, as prescribed by the Standard Model.

CMS has also started looking for other bosons beyond the 600 GeV limit excluded so far. If new bosons turn up, it could mean that the one already found is just one of the five bosons predicted by supersymmetry, a different theory, and not the unique Higgs boson of the Standard Model.

So where do we stand? With more than twice the data used in July, scientists have moved from the hunt for an elusive boson to the first measurements of its properties. Once all the decay channels, their rates, and the spin and parity of this particle have been unequivocally established, we will know more about its identity.

For now, although it is still too early to say, this boson looks and sounds more and more like the Higgs boson. We will know a little more next March, when all the data have been analyzed with improved techniques. But it could take a long time before it has said its last word.

Pauline Gagnon

To be notified when new blog posts appear, follow me on Twitter: @GagnonPauline or by e-mail by adding your name to this mailing list

### The mystery remains on the Higgs boson

Thursday, November 15th, 2012

Ever since the discovery of what might be the Higgs boson last July, physicists from the CMS and ATLAS experiments have been trying to pinpoint its true identity. Is this the Higgs boson expected by the Standard Model of particle physics or some “Higgs-like boson” befitting a different theoretical model?

To tell the difference, we must check all its properties, like how often this boson decays into different types of particles, and determine its spin and parity, two properties of fundamental particles.

Since the new boson has a short lifetime, it breaks apart immediately after being created. There are five ways a Standard Model Higgs boson should decay that we can study at the Large Hadron Collider (LHC): breaking into two photons, two W or two Z bosons, two b quarks, or two tau leptons, in well-defined proportions. We must check both the presence of and the rate at which each decay mode occurs.

Last summer, just after the discovery of the new boson, both experiments reported unambiguous observations in only three channels. Unfortunately, the data sample was still too small to really be able to check if the new boson could decay into a pair of b quarks or tau leptons.

With more data available, the two experiments have just shown results for all channels today at a conference held in Kyoto as shown on the two figures below.

The left figure is for CMS and the right one for ATLAS. The values “σ/σSM” and “μ” are equivalent and represent the ratio of what is seen to what is expected from the Standard Model. So if μ is exactly one for a given channel, it means that channel decays at the rate expected from the theory. A value of zero would imply this particular decay channel is not seen at all, contrary to expectation. If μ has any other value, it implies the new boson does not behave quite as predicted. But one must take into account the error margin (the horizontal bar) before drawing any conclusion.

Both experiments have now measured decays into two b quarks and two tau leptons, and the errors have gone down for several channels. For now, CMS obtains a combined value of 0.88 ± 0.21 whereas ATLAS measures 1.3 ± 0.3. Both are compatible with 1.
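As a rough illustration of what “compatible with 1” means (a naive back-of-the-envelope check, not the experiments’ actual statistical combination, which accounts for systematic uncertainties and correlations between channels), one can compute how far each combined value sits from the Standard Model expectation and form a simple inverse-variance average:

```python
from math import sqrt

# Combined signal strengths mu (= sigma/sigma_SM) quoted in the text.
measurements = {"CMS": (0.88, 0.21), "ATLAS": (1.3, 0.3)}

# How far is each value from the Standard Model expectation mu = 1?
for name, (mu, err) in measurements.items():
    print(f"{name}: {(mu - 1) / err:+.2f} standard deviations from 1")

# Naive inverse-variance combination of the two numbers.
weights = {name: 1 / err**2 for name, (mu, err) in measurements.items()}
total = sum(weights.values())
mu_comb = sum(w * measurements[name][0] for name, w in weights.items()) / total
err_comb = 1 / sqrt(total)
print(f"combined: {mu_comb:.2f} +/- {err_comb:.2f}")
```

Each value lies within about one standard deviation of 1, and the naive combination comes out very close to 1.0 as well, which is why neither experiment claims any deviation from the theory.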

The confirmed presence of all five modes would be compatible with a spin-zero particle. Having in addition all the correct decay rates would make the new boson look much more like a Higgs boson but it would still not be quite sufficient. The new boson must also have positive parity as the Standard Model predicts.

The spin of a fundamental particle refers to its rotation on itself, as the name suggests. Parity has to do with flipping direction in space, exactly like what happens when we watch an event directly or through a mirror where the left and right directions are inverted. Particles with a positive parity look the same when you observe them directly or through a mirror.

The parity can be determined by looking at the direction taken by all fragments after the boson decays. Depending on its parity, its debris will fly in a preferred direction. For example, CMS measured all angles between the four electrons or muons when the “Higgs-like” bosons decay into two Z bosons, each one ending in a pair of electrons or muons. Then they compared the distributions with two standards: one for positive, one for negative parity as shown on the figure below.

The left curve in blue shows the probability one would measure for a particular point on the horizontal axis if the new boson had a negative parity. The right curve in pink shows the same for a particle with positive parity. The value measured by CMS (green arrow) indicates the new boson most likely has a positive parity as expected by the Standard Model.

CMS also started looking for other bosons with masses beyond 600 GeV, the current excluded limit. If new bosons turn up, it could mean we have found one of the five Higgs bosons predicted by supersymmetry, a new theoretical model, and not the single Higgs boson predicted by the Standard Model.

So where do we stand? With more than twice as much data as shown in July, scientists have moved from searching for this elusive particle to starting to measure its properties. Once the decay channels, decay rates, spin and parity are clearly established, we will be able to determine its identity.

It is still too early to tell but the new boson looks like, sings like and dances more and more like a Higgs boson. More certainty will come out next March at a winter conference with still more data and improved analyses. But it will take a long time to figure out beyond any doubt if the discovered boson was really the Standard Model Higgs boson.

Pauline Gagnon

### Mixing it up

Wednesday, November 14th, 2012

One of the other results presented at the Hadron Collider Physics Symposium this week was the result of a search for $$D^{0}–\bar{D}^{0}$$ mixing at LHCb.

Cartoon: If a $$D^0$$ is produced, at some time t later, it is possible that the system has "oscillated" into a $$\bar{D}^0$$. This is because the mass eigenstates are not the same as the flavor eigenstates.

Neutral meson mixing is predicted for any neutral meson system, and has been verified for the $$K^0–\bar{ K}^0$$, $$B^0–\bar{B}^0$$ and $$B_s^0–\bar{B_s}^0$$ systems. However, for the $$D^0–\bar{D}^0$$ system, no single measurement had provided a result with greater than $$5\sigma$$ significance that mixing actually occurs, until now.

The actual measurement is of the time-dependent ratio $$R(t)$$ of $$D^0 \rightarrow K^+ \pi^-$$ (“wrong sign”) decays to $$D^0\rightarrow K^- \pi^+$$ (“right sign”) decays, analyzed through its Taylor expansion in the decay time. Charge conjugates of these decays are also included. They are called “wrong sign” and “right sign” because the right-sign decays are much more probable, according to the standard model.

The mixing of the $$D^0–\bar{D}^0$$ system is described by the parameters $$x = \Delta m /\Gamma$$ and $$y = \Delta \Gamma / 2\Gamma$$, where $$\Delta m$$ is the mass difference between the two mass eigenstates, $$\Delta \Gamma$$ is the difference of their widths, and $$\Gamma$$ is the average width. What appears in the description of $$R$$, however, are $$x'$$ and $$y'$$, which are $$x$$ and $$y$$ combined with information about the strong phase difference between the right-sign and wrong-sign decays. The important point about $$x'$$ and $$y'$$ is that they appear in the time-dependent terms of the Taylor expansion of $$R$$. If there were no mixing at all, we would expect the ratio to remain constant and the higher-order time dependence to vanish. If mixing does occur, however, a clear, non-flat trend should be seen, and hence a measurement of $$x'$$ and $$y'$$. That is why the time-dependent analysis is so important.
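For reference, the expansion being fitted is usually written, to second order in the decay time $$t$$ (in units of the $$D^0$$ lifetime $$\tau$$, with $$R_D$$ the ratio of doubly Cabibbo-suppressed to Cabibbo-favored decay rates, and neglecting CP violation), as:

$$R(t) \approx R_D + \sqrt{R_D}\,y'\,\frac{t}{\tau} + \frac{x'^2 + y'^2}{4}\left(\frac{t}{\tau}\right)^2$$

If $$x' = y' = 0$$, the ratio is just the constant $$R_D$$; any linear or quadratic dependence on $$t/\tau$$ is the signature of mixing.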

Fit of ratio of WS and RS decays as a function of decay time of the D meson. Flat line would be no mixing, sloped line indicates mixing. From http://arxiv.org/pdf/1211.1230.pdf

Result of the mixing parameter fit of the neutral D meson system. 1,3 and 5 standard deviation contours are shown, and the + represents no mixing. From http://arxiv.org/pdf/1211.1230.pdf

The result is 9.1 $$\sigma$$ evidence for mixing, in agreement with previous results from BaBar, Belle and CDF. On top of confirming that the neutral D meson system does mix, this result is of particular importance because, coupled with the result of CP violation in the charm system, it raises the question of whether there is much more interesting physics beyond the standard model involving charm just waiting to be seen. Stay tuned!

### Huge impact from a tiny decay

Wednesday, November 14th, 2012

The Hadron Collider Physics Symposium opened on November 12 in Kyoto on a grand note. For the first time, the LHCb collaboration operating at the Large Hadron Collider (LHC) at CERN showed evidence for an extremely rare type of event, namely the decay of a Bs meson into a pair of muons (a particle very similar to the electron but 200 times heavier). Mesons are a class of composite particles formed from a quark and an antiquark. The Bs meson is made of a bottom quark b and a strange quark s. This particle is very unstable and decays in about a picosecond (a millionth of a millionth of a second) into lighter particles.

Decays into two muons are predicted by the theory, the Standard Model of particle physics, which states they should occur only about 3 times in a billion decays. In scientific notation, we write (3.54±0.30)×10⁻⁹, where the value of 0.30 represents the error margin on this theoretical calculation. Now, the LHCb collaboration has proudly announced that they observed this decay at a rate of (3.2 +1.5/−1.2)×10⁻⁹, a value very close to the theoretically predicted one, at least within the experimental error.
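As a quick sanity check (a simplified, illustrative treatment: it just picks the experimental error on the side facing the prediction and adds the theory error in quadrature, rather than handling the asymmetric errors rigorously), one can estimate how many standard deviations separate the measurement from the prediction:

```python
from math import hypot

# Branching fractions in units of 1e-9, as quoted in the text.
measured, err_up, err_down = 3.2, 1.5, 1.2   # LHCb measurement (3.2 +1.5 -1.2)
predicted, err_theory = 3.54, 0.30           # Standard Model prediction

# The measurement lies below the prediction, so the upper experimental
# error is the relevant one; combine it with the theory error in quadrature.
diff = predicted - measured
n_sigma = diff / hypot(err_up, err_theory)
print(f"difference: {diff:.2f}e-9, about {n_sigma:.2f} standard deviations")
```

The separation comes out at roughly a fifth of a standard deviation, which is why the two values can be called very close.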

Here is the plot shown by the LHCb collaboration for the number of events found in data as a function of the combined mass of the two muons. The solid blue line represents the sum of all types of events from known phenomena containing two muons. The dashed curve in red shows the number of events coming from a Bs meson. With the current error margin on the measurement (shown by the vertical and horizontal bars on the data points), the data seem to agree with all expected contributions from known sources, leaving little room for new phenomena.

This represents a great achievement, not only because this is the rarest process ever observed, but because it puts stringent limits on new theories. Here is why.

Theorists are convinced that a theory far more encompassing than the Standard Model exists, even though we have not detected its presence yet. It is as if the Standard Model were to particle physics what the four basic operations (addition, subtraction, multiplication and division) are to mathematics: sufficient to tackle daily problems, but one needs algebra, geometry and calculus to solve more complex ones. And in particle physics, we do have problems we cannot solve with the Standard Model, such as explaining the nature of dark matter and dark energy.

A good place to catch the first signs of “new physics” is where the Standard Model predicts very faint signals such as in Bs mesons decaying into two muons. These decays occur extremely rarely because the Standard Model only has limited ways to produce them. But if an additional mechanism comes into play due to some new theory, we would observe these decays at a rate different from what is expected within the Standard Model.

This is a bit like using the surface of a lake to detect the presence of an invisible creature, hoping its breath would create a ripple on the water surface. It would only work if the lake were extremely calm or disturbed only by an occasional tiny fish.  Here the Standard Model acts like all known little animals creating ripples on the water surface.  The hope was to detect other ripples in the absence of known causes (fish, frogs or mosquitoes). The LHCb result reveals no extra ripples yet. So either the new creature does not breathe as expected or we need to find another method to see it. It will be easier to know once the error margin is reduced with more data.

This new result pushes possible new physics even further out of reach. Nevertheless, it will help theorists eliminate faulty models, as on the plot below, and eventually zero in on the right solution. Meanwhile, experimentalists will have to devise yet more stringent tests to be able to discover the way to this new physics.

This plot shows how this measurement (horizontal axis) shown earlier this year reduced the space where new physics could be seen. With this new result, the constraints will even be stronger.

(For more details, see LHCb website)

Pauline Gagnon

### Major impact from a tiny measurement

Wednesday, November 14th, 2012

The Hadron Collider Physics Symposium opened Monday in Kyoto on a good note. For the first time, the LHCb collaboration, which operates at the Large Hadron Collider (LHC) at CERN, unveiled the very first evidence for extremely rare decays: those of Bs mesons into two muons (a particle similar to the electron but 200 times heavier). Mesons are a class of particles made of a quark and an antiquark, here a b quark and an s antiquark.

These mesons are very unstable and decay into lighter particles within a picosecond, that is, a millionth of a millionth of a second.

These decays into two muons are predicted by the theory, the Standard Model of particle physics. They should occur about 3 times per billion decays. In scientific notation, we write (3.54±0.30)×10⁻⁹, where 0.30 represents the theoretical error margin. The LHCb collaboration thus proudly announced a measured value of (3.2 +1.5/−1.2)×10⁻⁹, very close to the theoretical one once the experimental error margin is taken into account.

Here is the distribution of values obtained by LHCb for the combined mass of the two muons. The solid blue curve represents the sum of all contributions from known sources containing two muons. The dashed red curve shows the number of events coming from Bs meson decays. The black points represent the data with their error margins (vertical and horizontal bars). The data seem to match the sum of all known sources, leaving little room for new phenomena.

This result is a real feat, not only because it is the smallest decay rate ever measured, but above all because it places strong constraints on all new theoretical models. Here is why.

Theorists are convinced that a theory more complete than the current Standard Model exists, even though we have not yet managed to detect the slightest effect of it. It is a bit as if the Standard Model were to particle physics what arithmetic (addition, subtraction and so on) is to mathematics: quite sufficient for everyday operations, but for more complex tasks we need algebra or calculus. In particle physics, we have problems that the Standard Model cannot solve; for example, it does not explain the nature of dark matter. So we are looking for the first signs of “new physics”.

A good place to look for this new physics is precisely where the current theory predicts very rare phenomena, such as these Bs meson decays. They are so rare because the theory has only very few ways to produce them. If this new physics exists, we suspect it would contribute through new mechanisms. We might then see this kind of decay occur a little more often.

It is a bit like using the surface of a lake to detect the presence of an invisible creature, hoping to see the ripples its breath would produce on the water. Of course, this could only work on a very calm lake, barely disturbed by a tiny fish or an insect, in the hope of spotting ripples coming from another source. The current LHCb result shows that no ripple is visible on the lake that cannot be attributed to an already known cause. If the situation remains unchanged once the error margin has shrunk (as the data now being collected are analyzed in the coming months), we will have to conclude either that the new creature does not breathe as we thought, or that a more effective method must be devised.

This new result suggests that new physics will be harder to reveal than we had hoped. In the meantime, it will allow theorists to eliminate inadequate models, which will eventually point them in the right direction. Meanwhile, experimentalists will have to devise more refined techniques to finally put their finger on this new physics.

This plot shows how this measurement (represented by the horizontal axis), made earlier this year, had already greatly constrained the allowed values for various models. The new result will tighten these constraints even further.

(For more details, see the LHCb public page)

Pauline Gagnon

To be notified when new blog posts appear, follow me on Twitter: @GagnonPauline or by e-mail by adding your name to this mailing list

### Foxes, hedgehogs and particle physicists

Sunday, November 11th, 2012

I’m just back from a trip to CERN, which was mostly for a week of meetings about how well the computing for the CMS experiment is doing and how it could be done better. But meanwhile, the collaboration was also working through the scrutiny of new measurements that are targeted to be released for the Hadron Collider Physics conference that starts tomorrow (I guess today, given the time zone) in Kyoto. Obviously I can’t discuss these results yet. So instead I’ll spend a little time on philosophy, which I admit makes this a much less interesting post. But bear with me.

In case you’ve been hiding under a rock for the last week, you should know that Nate Silver of the FiveThirtyEight blog made another successful prediction of a presidential election outcome, state by state. I’ve written about Nate Silver’s work here before, because I admire his adoption of what I think is a particle-physics kind of approach to making predictions.

So, being in a Nate Silver kind of mood as I headed off on my trip, I bought an e-copy of his new book, “The Signal and the Noise,” to read on the plane. I’m not sure that I’d call it a great work of literature, but Silver does have some very interesting things to say about how to make predictions. In one section he reminds us of the classic Isaiah Berlin essay, “The Hedgehog and the Fox.” And in case you have been hiding under a different rock, that refers to a quote from Archilochus, who observed that “the fox knows many things, but the hedgehog knows one big thing.” Hedgehogs view the world through the prism of a single big conceptual framework, while foxes don’t believe that’s possible and are willing to be more flexible in their approaches. Silver asserts that it’s the foxes of the world who make better predictions. You have to be willing to try many different approaches and integrate many different tactics to make a good prediction, and, perhaps most importantly, to be prepared to adapt to new information and to change your ideas when your current framework isn’t working.

This got me thinking: are particle physicists foxes or hedgehogs? I would say some of both. Our hedgehog-ness is in our belief in physical law. That’s a big idea that is unavoidable. It is true that our knowledge of physics is always subject to revision in the face of new information, but we believe that in circumstances that have already been well-explored through experiment, physical laws hold without question. Certainly the much-revered standard model of particle physics is taken as a given in regimes where it has been thoroughly tested. And at the very least, we believe in physical law as a big, consistent framework, an ideal even if not something we can truly realize.

But in terms of our approaches to experiment, we have to be foxes. I can say this about some of the results that will be shown at the HCP conference — these are hard measurements, and to get them done, we’ve had to use every trick in the book. A huge variety of techniques have been brought to bear to wring every last bit of useful information out of the data, and it has taken a gargantuan effort from a large team of people. I always come out of the detailed presentations of a measurement somewhat stunned by its complexity. We’re also always on the lookout for new and better tricks. No matter how good an idea sounds on paper, if it isn’t effective in making a measurement, or is less effective than other ideas, then you abandon it and find something better. It’s this flexibility and willingness to evolve and change that helps us do this work.

As a younger person, I saw myself as at least an aspiring hedgehog, hoping to find the one big idea that would pull everything together and give me a complete grasp of the world. But I’ve come to realize that life, and science, is more complicated than that, and you have to be a fox just to get through it all.

### The IN2P3 Computing Centre has inaugurated its Computing Museum!

Wednesday, November 7th, 2012

The idea was born four years ago during the first edition of the Particule.com Festival. Some engineers at CC-IN2P3 (the IN2P3 Computing Centre), led by Fabien Wernli, wanted to show old computing hardware to middle- and high-school students and let them see how computing has evolved, even at the risk of looking hopelessly old-fashioned by talking about Minitel terminals and punched cards to these iPhone and smart-tablet addicts.

Four years ago, then, an exhibition was quickly put together on the garden level of the Centre; we retrieved equipment that had been lying on dusty shelves and asked the Aconit association (Association pour un COnservatoire de l’Informatique et de la Télématique) for a hand. The exhibition was a great success with students and their teachers, as well as with the general public.

Building on that success, the idea was taken up again and reworked. After all, why not set up a permanent exhibition within CC-IN2P3? How do you show the dazzling evolution of computing? How do you explain the basic concepts without giving visitors a headache? And how do you explain that physicists have often been at the forefront of certain developments, such as the web?

The idea matured, and the teams worked together to design a project that presents computing in a fun, interactive way and recalls how physics has used these new technologies over the last forty years.

Thus was born the CC-IN2P3 Computing Museum, officially inaugurated on Thursday October 11, at the same time as the launch of the Fête de la Science in the Rhône.

Co-funded by the Region, the CC-IN2P3 Computing Museum was previewed for schools during the 2nd edition of the Particule.com Festival on October 11 and 12, and remains open to them throughout the year. Students were able to discover the evolution of computing hardware through display cases, videos, games and hands-on activities.

CC-IN2P3 thanks everyone who answered our call and sent us equipment.

PS: note that the Computing Museum already has its first fan

– Article submitted by Gaëlle SHIFRIN of the IN2P3 Computing Centre

### THE ROLE OF PURE RESEARCH

Friday, November 2nd, 2012

Thomas Edison (1847 – 1931) was a genius. He was also the ultimately practical person, devoted to producing inventions with commercial applications. His quote on airships from 1897 is typical: “I am not, however, figuring on inventing an airship. I prefer to devote my time to objects which have some commercial value, as the best airships would only be toys.” Fortunately the Wright brothers liked playing with toys, and indeed the airplane was just a toy for many years after it was first invented. But just ask Boeing, Airbus, or even Bombardier if airplanes are still toys. Progress requires both the practical people, like Edison, and the people who play with toys, like the Wright brothers.

Let’s pick on Edison again. The practical Edison patented something known as the Edison effect, but did nothing more with it. The effect was this: if a second electrode is put in a light bulb, an electrical current flows when a voltage is applied in the right direction. This led to the diode, which improved radio reception, and, in the hands of people who liked playing with toys, led to the vacuum tube. The vacuum tube is now largely obsolete but began the electronics revolution. Again, we see that progress depends on the people who like playing with toys as well as the people concerned with immediate practical applications. The practical use of an observation, like the Edison effect, is frequently not immediately obvious.

With the light bulb, Edison played a different role. The light bulb is at the end of the chain of discovery. It relies on all the impractical work of people like Michael Faraday (1791 – 1867) and James Maxwell (1831 – 1879), who developed the ideas needed for the practical generation and transmission of electrical power. Without the power grid that their discoveries made possible, the light bulb would have only been a toy.

The discovery of radium is another example of a pure research project leading to practical results. At one time, radium was used extensively to treat cancer. To quote Madame Marie Curie[1] (1867 – 1934): “We must not forget that when radium was discovered no one knew that it would prove useful in hospitals. The work was one of pure science. And this is a proof that scientific work must not be considered from the point of view of the direct usefulness of it. It must be done for itself, for the beauty of science, and then there is always the chance that a scientific discovery may become like the radium, a benefit for humanity.”

An even more striking example of how serendipitously science advances technology is the modern computer. It relies on transistors, which are very much quantum devices. The early development of quantum mechanics was driven by the study of atomic physics. So, I could just imagine Ernest Rutherford (1871 – 1937), an early experimenter in atomic physics, thinking: “I want to help develop a computing device so I will scatter some alpha particles.” Not bloody likely! The implications of pure research are simply unknowable. However, I doubt the Higgs boson will ever have practical applications. The energy scale is simply too far removed from the everyday scales.

But pure research contributes to society in another way. A prime example is the cyclotron. It was invented in 1932 for use in the esoteric study of nuclear physics. Initially, cyclotrons were found only in top physics departments and laboratories. Now they are in the basements of many hospitals, where they are used to make rare isotopes for medical imaging and treatment. The techniques developed for pure research frequently find their way into practical use. The idea is captured nicely in the term “space age technology”: while standing on the moon did not produce any real benefits to mankind, the technology developed in the enterprise did.

Of course, I cannot leave this topic without bringing up the World Wide Web. The initial development was done at CERN in support of particle physics. I remember a colleague getting all excited about this new software development, but initially it was something only a geek like her could love. The links were denoted by numbers that had to be typed in; there was no clicking on links. Then the National Center for Supercomputing Applications (NCSA) at the University of Illinois Urbana-Champaign developed a browser, Mosaic, with a graphical interface and embedded pictures. This browser was released in 1993 and looked much like any browser does today. The rest is history. But two other things were needed to make the World Wide Web a hit. The first was computers (those things that were developed from Rutherford scattering alpha particles) with sufficient capabilities to run the more powerful browsers and, of course, the internet itself. The internet was initially just an academic network, but the World Wide Web provided the impetus to drive it into most homes. Here again we see a combination of efforts: academic at CERN and NCSA, and commercial at the internet providers.

Thus, we see pure research providing the raw material for technological development. The raw material is either the models, like quantum mechanics, or the inventions, like cyclotrons. These are then used by practical men like Edison to generate useful technology. However, there is also a cultural component: satisfying our curiosity. While the spinoffs may be the main reason politicians and taxpayers support pure science, they are not the motivation driving the scientists who work in pure science. In my own case, I went into physics to understand how the universe works. To a large extent that desire has been fulfilled, not so much by my own efforts but by learning what others have discovered. More generally, the driving force in pure science is curiosity about how the universe works and the joy of discovery. Like Christopher Columbus (1451 – 1506), Robert Scott (1868 – 1912) or Captain James Kirk (b. 2233), pure scientists are exploring new worlds and going where no man, or woman, has gone before.

[1] The first person to win two Nobel prizes.

### A Dalitz What Now?

Friday, November 2nd, 2012

Perhaps in your wanderings of physics papers, you’ve seen plots which look like this:

$$D^{+}\rightarrow K^{-} \pi^{+} \pi^{+}$$Dalitz Plot. Borrowed from Lectures by Brian Meadows, Cincinnati.

While yes, you may think that Easter has come early, this is actually an honest-to-goodness physics analysis technique. Developed by R.H. Dalitz in 1953, this plot visually illustrates the interference of the quantum mechanical amplitudes of the final state particles. Let's take a step-by-step walk-through of the plot.

The Setup

Dalitz plots were originally used to investigate a three body final state, for instance $$D^{+}\rightarrow K^{-} \pi^{+} \pi^{+}$$. Taking this example, let’s imagine we’re in the $$D^{+}$$ rest frame (it’s just sitting there), then the $$D^{+}$$ decays. The decay products can go a variety of directions, so long as momentum is conserved.

The directions and momenta with which the particles fly determine where we are in the plot. For reference, we can label the daughters as 1, 2 and 3, then assign them masses $$m_1, m_2$$ and $$m_3$$, and momenta $$p_1, p_2$$ and $$p_3$$, respectively. Finally, let the $$D^{+}$$ have mass M. Its momentum is 0 since it's just sitting there. With a bit of algebraic manipulation, and Einstein's relation $$E^2=p^2+m^2$$ (c=1, for simplicity of calculation), we can define a whole host of new variables, for instance $$m_{12}^2 = (E_1+E_2)^2-(p_1+p_2)^2$$.
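As a quick sketch, that combination can be computed directly from the daughters' masses and 3-momenta. Here is a minimal Python illustration (with c = 1; the numbers in the check are arbitrary, not a real decay configuration):

```python
import math

def inv_mass_sq(m1, p1, m2, p2):
    """Invariant mass squared m12^2 = (E1 + E2)^2 - |p1 + p2|^2, with c = 1."""
    e1 = math.sqrt(m1**2 + sum(x * x for x in p1))  # E = sqrt(p^2 + m^2)
    e2 = math.sqrt(m2**2 + sum(x * x for x in p2))
    psum = [a + b for a, b in zip(p1, p2)]          # vector sum of 3-momenta
    return (e1 + e2)**2 - sum(x * x for x in psum)

# Sanity check: two particles at rest give m12^2 = (m1 + m2)^2
print(inv_mass_sq(0.5, (0, 0, 0), 0.5, (0, 0, 0)))  # -> 1.0
```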

The Axes

Let’s take  $$m_{12}^2 = (E_1+E_2)^2-(p_1+p_2)^2$$ as our guinea pig. Physically, we can think of this as combining particles 1 and 2 into a single particle, and then plot its effective invariant mass spectrum. This is quite similar to looking at the invariant dimuon mass squared of the Higgs searches. In this case, however, we then plot either $$m_{13}^2$$ or $$m_{23}^2$$ on the remaining axis. Since all of the momenta and energies are related, picking either $$m_{13}^2$$ or $$m_{23}^2$$ fully defines the system. This gives us all the ingredients we need for the plot!
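The relation between the variables can be made explicit. Because energy and momentum are conserved in the decay, the three pairwise invariant masses squared satisfy the standard identity

$$m_{12}^2 + m_{13}^2 + m_{23}^2 = M^2 + m_1^2 + m_2^2 + m_3^2,$$

so once $$m_{12}^2$$ and one of the other two are fixed, the third is completely determined.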

The Boundary

PDG, Review of Kinematics

After setting up the axes above, we need to plot the actual figure. The boundary is completely described by energy and momentum conservation. For example, you can ask "What is the minimum invariant mass squared that the (1,2) pair could have?" After a bit of consideration, you would say "Why, the sum of the two masses, squared!" Likewise, the maximum it could have is the mass of the parent minus the mass of the other daughter, then squared; in that case, particle 3 is left at rest and all the remaining momentum belongs to the $$m_{12}$$ system. Repeating this process, finding the allowed range of $$m_{23}^2$$ for each value of $$m_{12}^2$$, then gives the complete boundary of the Dalitz plot. Some special spots are shown in the PDG plot above. Forming the complete boundary is not necessarily a simple task, especially if the particles are indistinguishable. For the sake of explanation, we will stick to our simple example here.

The Innards

Finally, the bulk of the Dalitz plot is defined by the interactions of the final state particles. If these particles did not interact, we would expect a completely flat distribution across the inside of the plot. That they do interfere follows from the quantum mechanical amplitude for the initial state to transform into the final state, given the interaction of the system. The result is a vast array of structure and symmetries across the plot. For the example of $$D^{+}\rightarrow K^{-} \pi^{+} \pi^{+}$$, the result is shown above. Each little dot is one event, and we can clearly see that there are places where the density is high (resonances, the so-called "isobar model"), and places where there is almost no density at all (destructive interference).
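To give a feel for where that structure comes from, here is a toy one-dimensional sketch of the isobar idea: each resonance contributes a complex Breit-Wigner amplitude, and the observed density is the modulus squared of their sum. The masses, widths, and relative phase below are made up purely for illustration; a real analysis uses far more sophisticated line shapes:

```python
import cmath
import math

def breit_wigner(s, m0, gamma):
    """Simple Breit-Wigner amplitude as a function of s = m^2."""
    return 1.0 / complex(m0**2 - s, -m0 * gamma)

def intensity(s, phase):
    """|A1 + e^{i*phase} * A2|^2 for two hypothetical overlapping resonances."""
    a1 = breit_wigner(s, 0.8, 0.10)  # made-up resonance 1
    a2 = breit_wigner(s, 1.0, 0.10)  # made-up resonance 2
    return abs(a1 + cmath.exp(1j * phase) * a2) ** 2

# Between the two peaks, flipping the relative phase moves density around:
print(intensity(0.81, 0.0), intensity(0.81, math.pi))
```

Even in this crude sketch, changing the relative phase redistributes the intensity, which is the one-dimensional analogue of the depleted and enhanced regions seen in the plot.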

The structures can be quite different depending on the spin of the resonance as well. For instance, the first plot below shows a resonance as a band where the boxes are bigger. This plot is actually Monte Carlo simulation for the process $$\pi^- p\rightarrow f_0 n\rightarrow\pi^0 \pi^0 n$$, produced with an $$f_0$$ mass of 0.4 GeV/c². Since the $$f_0$$ is a scalar (spin 0), the resonance band extends uniformly across the plot. In the second plot, the $$\rho(770)$$ is produced in the decay $$D^{-}\rightarrow K^{-} \pi^{+} \pi^{-}$$. This too is Monte Carlo. The fact that the $$\rho(770)$$ is a vector (spin 1) is what produces the distinct shape shown below. This simple example shows how one can identify the spin of a resonance by visually inspecting the Dalitz plot.

MC $$f_0$$ Dalitz Plot. From the Crystal Ball Collaboration: http://arxiv.org/abs/nucl-ex/0202007

$$\rho(770)$$ resonance in $$D^{-}\rightarrow K^{-} \pi^{+} \pi^{-}$$. From lectures by Brian Meadows.

Now, there's a lot more to Dalitz plot analysis than what I've presented here. There can be reflections across the plot and different resonances interfering with each other in quite complicated ways. For example, in the decay $$D^{-}\rightarrow K^{-} \pi^{+} \pi^{-}$$, if we had a $$K^{*}_{0} (800)$$ interfere with the $$f_0$$, the Dalitz plot might look something like this:

$$K^{*}_{0} (800)$$ interfering with $$f_0$$ in decay $$D^{-}\rightarrow K^{-} \pi^{+} \pi^{-}$$. From Brian Meadows.

The distinct shape, which looks to my eye a bit like a butterfly, is due to the phase difference between the two resonances.

So now you at least have a bit of an intro to the Dalitz plot, in this all too brief and quite simplified example.