Archive for May, 2011

Illusion or reality?

Tuesday, May 31st, 2011

Last April 4th, the CDF collaboration at Fermilab near Chicago published a paper that made headlines, with the news spreading through the particle physics community at a speed approaching that of light. And they have just updated this result, making it even more convincing. An announcement of the potential observation of a new, unknown, unsuspected and unusual particle rarely goes unnoticed! It is what every particle physicist has been waiting for for years. So this is very exciting and, if it turns out to be true, it could be the biggest discovery in decades.

While looking for certain rare decays of boson pairs, in particular those of a W boson accompanied by either another W or a Z boson, CDF observed an excess of events that can hardly be attributed to any known source. Bosons are a category of particles, the one associated with the fundamental forces of physics. The Z and W bosons are the carriers of the electroweak force.

These two-boson events are found by reconstructing the first W when it decays into a muon or an electron plus a neutrino. One then looks for the debris of the second boson when it breaks up into two jets of particles. Such events are rare but are nevertheless predicted by the Standard Model, our theoretical tool that so far describes nearly everything we have observed in particle physics. These boson pairs, rare but expected, are generally buried under what we call background: events that are far more abundant but much less interesting because they are well known. This background tends to mask the signal, even to mimic it, which makes it hard to distinguish from the events we are looking for. It is a bit like looking at a pile of sand: in our case, roughly 95% of the pile is ordinary sand, and a small 5% comes from a rare metal sprinkled on top. What CDF observes is that, in addition to the precious metal, they think they also see a bit of gold. But as everyone knows, all that glitters is not gold. The question is therefore whether this observation corresponds to something real, or whether it is simply an illusion.

The key element here is the word "roughly" when we say that roughly 95% comes from background. To estimate the number of rare events (the boson pairs), one must know not only the exact amount of background among all the selected events, but also its characteristics. Without that, how can one claim that there is something extra? The slightest uncertainty in the background prediction can turn into an excess and create the illusion of new particles. Or the excess could be caused by a statistical fluctuation in the number of observed events. We never predict the exact number of events; there is always a margin of error. But everyone in our field knows this, including the hundreds of physicists in the CDF group. If they publish these results, it is also to invite the members of the other experiments to confirm or refute this effect with different detectors and different data.

Look at this figure taken from the CDF paper. It shows the distribution of the combined mass of the two jets attributed to the decay of the second boson. The top left plot shows the simulation of the various well-known processes predicted by the Standard Model, indicated in green, white, blue and grey. Going back to our analogy, all of that represents the sand. The simulation for the boson pairs is in red (the rare metal). This is what CDF set out to measure in the first place. There, the two jets come from either a W or a Z, which is why we see an accumulation around 80 and 91 GeV/c², the respective masses of the W and Z bosons. Of course, the reconstruction is not perfect, which is why the mass is reconstructed around the known masses of these bosons rather than exactly at them. All of this is normal.

Where it gets interesting is what happens when we subtract all the background (the sum of all contributions from known sources) from what is actually observed in the data, the points shown in black on the figure. That is what we see in the top right plot. There, there does indeed seem to be another peak. In fact, if we add the simulation of a new particle with a mass of around 140 GeV/c², shown in blue in the bottom right plot, the sum of the known sources plus this unknown one seems to match the real data quite well. All of this lies within the margin of error indicated by the vertical black lines attached to each point. Taking this margin of error into account, the probability that this excess is due to a statistical fluctuation is less than 1%. But less than 1% does not mean zero… it could still happen.

Here at CERN, the CDF announcement last April did not go unnoticed. Several teams working on related topics quickly went back to their data in search of this potential new particle. In fact, we already had several similar distributions, because we too, in the ATLAS and CMS collaborations, had been looking for these boson pairs.

ATLAS already had similar distributions in hand. So the day after the announcement we could take a first look, and within a few days redo the selection of these events, reproducing the CDF analysis exactly.

Same scenario at CMS: once again, a lot of activity in the days following the announcement, meetings called at short notice and abundant e-mail exchanges with plenty of figures attached.

But so far, nothing new on our side. That does not mean the effect does not exist. Of course it may be spurious, but it may also simply be that we have not yet analysed enough data to see it. We will know more once we have completed the analysis of the data collected in 2011, which will most likely be done in time for the big summer conferences.

As can be seen in the figure below released by ATLAS, based on the 2010 data only and applying the same selection criteria as CDF, there is no difference between what is observed in the real data collected in 2010 (the black points) and the theoretical predictions shown by the various contributions from known sources (in colour). Everything is consistent within the margin of error represented by the hatched area. Of course, it will all get much more interesting once CMS and ATLAS complete the calibration and analysis of the 2011 data. And we already have ten times more of it than in 2010!


The other Fermilab experiment, D0, is now in the best position to verify this observation independently. They are expected to release their results very soon.

D0 at Fermilab, as well as CMS and ATLAS here at CERN, must now confirm or refute the hypothesis put forward by CDF. If this new particle exists, everyone should see it, thereby proving its existence. For us to believe it, we need this second confirmation. Until then, we can only wonder whether we are dealing with one of those statistical fluctuations that keep making our lives miserable. We will surely know more at the first summer conferences. So stay tuned!

Pauline Gagnon

To be alerted when new blogs are posted, follow me on Twitter: @GagnonPauline


Could this be real?

Tuesday, May 31st, 2011

On Monday, April 4th, the CDF collaboration from Fermilab released a new paper, which they have just now updated with stronger evidence, announcing they are seeing “something”. Read “something” as in unexpected, unusual, beyond anything known. Exactly what every experimental physicist hopes for! This news made headlines and spread through the High Energy Physics community faster than gossip. This is very exciting, and if it turns out to be real, it’d be the biggest discovery in many decades!

CDF found an excess of events when looking for some rare di-boson pairs, namely a W boson being produced in association with either another W boson or a Z boson. This second boson is seen through its debris, two jets of particles emerging when it disintegrates. Such di-boson events are rare but are predicted by the Standard Model, the theory that pretty much describes everything we have observed in this field so far. These rare but standard events are found amid so-called “background events”, more abundant, run-of-the-mill events that mimic the more unusual ones we are trying to study. It’s like looking at a pile of sand: in our case, we know roughly 95% of it is just sand, while 5% comes from some rare metal sprinkled on top of it. What CDF observes is that in addition to the rare metal, they might also see some unexpected gold dust. But as everyone knows, all that glitters is not gold… The whole question is then: Is this real or just some fluke?

The key word here is that “roughly” 95% is background. To estimate the exact amount of rare events, you must know quite precisely not only how much background there is underneath, but also the specific characteristics of these events. Any small error in the background prediction and you would artificially create the signs of a new particle. But there is another way to create an excess: even when we know precisely the amount of background to expect, this number is not fixed, but just what one should see on average. It can vary within known limits, but occasionally by more than expected. That’s what we call a “statistical fluctuation”. But everybody in our field knows that, including the hundreds of experimentalists from CDF. So if they put this out in the open, they have very good reasons to believe it could be true. By going public about this, they are in fact inviting all the other experiments to search their own data.
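To get a feel for the numbers (a purely illustrative example of mine, not figures from the CDF paper): in a counting experiment that expects \(b\) background events, the typical statistical fluctuation is about \(\sqrt{b}\), so an excess of \(n\) events corresponds to a significance of roughly \(n/\sqrt{b}\) standard deviations. With \(b = 400\) expected events, seeing 420 is an ordinary one-sigma wobble, while seeing 460 would be a three-sigma excess and much harder to brush aside.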

Look at the CDF figure shown above where we can see the distribution of the combined mass of the two jets in these events. Going back to our analogy, the background is shown here on the top left plot in green, white, blue and grey, and represents the bulk of regular sand, all processes that are well known. The di-boson events shown in red would be the rare metal CDF was looking for. There, the two jets come from either a W or a Z boson. That’s why we can see a broad peak around 80 and 91 GeV/c², the respective masses of W and Z bosons. Of course, due to some inaccuracies in the mass reconstruction, they don’t all come out at these exact values but instead form a broad peak in that area. That’s all typical.

What’s interesting here is what happens when you subtract all the background in the top left plot to obtain the top right plot. In principle, you should just be left with the events highlighted by the red line on the top right plot, but as you can see, there seems to be another peak at a higher mass, hinting at some unknown and completely unexpected new particle. This is made even more visible if you suppose those two jets are the debris of some particle with a mass of about 140 GeV/c², which is shown by the blue line on the bottom plots, especially on the right after background subtraction. The vertical black lines indicate the level of uncertainty on each point after the subtraction. This means that even taking into account the known inaccuracies in the measurement, the second, blue peak has less than a 1% chance of being due to a statistical fluctuation.

Coming back here at the LHC, you can be sure that when CDF released its paper in April, we all frantically scrutinized our own data, looking for this potential new particle. As it turned out, the ATLAS collaboration had already looked for di-boson events and even had very similar plots. So within a day, a small group of people gathered to share what they already had. A few more days, and they could reproduce exactly what the CDF Collaboration had done.

The same happened at the CMS collaboration. Again, lots of febrile activity, short-notice meetings, frantic exchanges of e-mails with various plots attached.

But so far, nothing! That does not mean it is not there. It can mean two things: either we are not sensitive enough yet or simply that this is not a real effect. We need more data to be able to make a definitive statement. And this new data is likely to be ready in time for the upcoming Summer conferences.

As can be seen from the figure released by ATLAS using the same selection criteria as CDF, we did not see a bump anywhere! The dots (the actual data collected in 2010) and the sum of all contributions from known sources (represented by the black curve) are all within the experimental uncertainty shown by the hashed area. It will get more interesting once we add all of the 2011 data, since we already have ten times more data undergoing various checks and calibration.

The other Tevatron experiment, D0, is best positioned right now for an independent crosscheck and should release their findings very soon. If this new particle exists, they will see it and confirm it in the process. But until another experiment can provide this independent crosscheck, we will all be left pondering. D0 at Fermilab, and CMS and ATLAS here at CERN, need to either confirm or dismiss this claim. Until then, there is no way to tell if this is real or just one of those darned statistical fluctuations we are constantly battling… So stay tuned for the summer conferences where we are sure to hear more about this.

Pauline Gagnon

To be alerted of new postings, follow me on Twitter: @GagnonPauline


Quite a while ago I wrote a post talking about the IceCube neutrino telescope and its potential to become the first detector to observe sources of high energy neutrinos in the sky. IceCube, located at the South Pole, detects neutrinos not by observing them directly, but by detecting the particle that is created when a neutrino interacts with the ice that surrounds the telescope or the rock underneath it. This charged particle, a muon most of the time, emits a bluish light called Cherenkov radiation which can be detected by the array of light sensors that make up IceCube.

This is how neutrino detection goes, but it turns out that neutrinos are not the only particles able to produce muons that reach the pristine ice surrounding IceCube. Cosmic rays (charged particles coming from the cosmos) in fact account for most of the events seen by IceCube. In general, for every million cosmic ray events seen by IceCube there is only one neutrino in our data set (and that neutrino is most of the time produced by another cosmic ray on the other side of the Earth!). This is my very convoluted way of saying that besides being a very nice neutrino detector, IceCube is also an amazing cosmic ray detector.

So, is there anything interesting that we can do with these cosmic rays that light up the detector at a rate of about 2000 events per second? Maybe we could plot their arrival directions in a sky map and see if they’re pointing us back to their sources.

Well, there’s a problem. We know that the vicinity of the Solar System is permeated by a magnetic field that bends the trajectories of these protons (which have energies of tens of TeV) in pretty much the same way that the LHC magnets bend the trajectories of protons around the collider ring. The main difference is that the magnetic field in the solar neighbourhood is not as organised and neat as that of the LHC, so these protons follow pretty chaotic paths before they reach the Earth, at which point they no longer point back to the source that produced them. This is why we should expect to see a completely featureless sky if we were to just plot the incoming directions of these TeV cosmic rays.
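A back-of-the-envelope estimate (my own numbers, not from the IceCube analysis) shows why: a proton of energy \(E\) in a magnetic field \(B\) circles with a gyroradius of roughly \(r \approx 1.1\,\mathrm{pc} \times (E/\mathrm{PeV})/(B/\mu\mathrm{G})\). For a 10 TeV proton in a typical interstellar field of a few microgauss that is only a few thousandths of a parsec, a few hundred times the Earth-Sun distance, and vastly smaller than the distance to any plausible source, so by the time the proton arrives its original direction has been completely scrambled.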

But you know that I wouldn’t be writing about this if this were the end of the story. Last year, IceCube published its first map of the TeV cosmic ray sky, and we found that, actually, there are significant features in it. These features are very weak, with the “hottest” spots in the sky differing from the “coldest” spots by only parts in a thousand in the number of events detected. This is where a data set with a huge number of cosmic rays comes in handy; with IceCube gathering billions of cosmic ray events every year, we can measure these minute differences very accurately. This study, performed by fellow UW-Madison colleagues Rasha Abassi, Paolo Desiati, and Juan Carlos Diaz Velez with data taken when the detector was only one-quarter of its final size, revealed that the cosmic ray sky is anisotropic, and that the visible excess and deficit regions each take up about half the sky.

Large scale anisotropy of cosmic rays as seen with the IceCube detector in its 22-string configuration. The red colour in this map indicates an excess of 0.2% with respect to a flat sky, while the blue indicates a deficit of the same strength.

The next question that we asked ourselves was: is that all that there is to it? Is this half-and-half feature the only remarkable thing about this cosmic ray sky? This is the question that we’ve been trying to answer for the past year with the group that I work with. Our group (Dr. Simona Toscano, Dr. Segev BenZvi, Prof Stefan Westerhoff, and myself) has focused its attention on the search for smaller structures in this cosmic ray sky, to see if there are features that are smaller in size than those previously reported. And here again the answer was yes!

Calculating the angular power spectrum of the sky map that we got for data taken with IceCube in its 59-string configuration (about 2/3 completed), we obtained the blue points shown in the graph below. The y axis shows a value that gives an idea of how strong the features in the sky are at a certain angular scale (given by the upper x-axis). We knew from the previous analysis that large structures (with sizes between 90 and 180 degrees in the sky) were present, but as you can see the blue points don’t go immediately into the grey bands, which indicate what we should expect for a “featureless” sky, but rather stay away from them down to angular sizes of 15 degrees.
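For those who want the math behind that plot (these are standard definitions, not anything specific to our paper): the sky map is decomposed into spherical harmonics, and the angular power spectrum is \(C_\ell = \frac{1}{2\ell + 1}\sum_m |a_{\ell m}|^2\), where a multipole \(\ell\) corresponds roughly to an angular scale of \(180^\circ/\ell\). The half-sky, dipole-like structure lives at \(\ell = 1\) or 2, while features about 20 degrees across show up around \(\ell \approx 10\).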

This tells us that besides the large scale structure already reported by IceCube, there must be regions of excess and deficit of cosmic rays that have typical sizes of ~ 20 degrees in the sky (~40 times the size of the Full Moon.)

The power spectrum of the cosmic ray anisotropy detected by IceCube. The presence of a large scale structure is evidenced by the peak to the left, while smaller structure can be seen as a departure from the gray band regions for angular scales between 15 and 35 degrees. After the subtraction of the large scale structure, the small scale structure persists (seen in red dots), which indicates that the presence of smaller structures is not due to an artifact caused by the presence of the large scale anisotropy.

 

Using a technique that allows us to filter out the large scale structures to focus only on the smaller regions, we got the map shown below, where we can see localised regions of excess and deficit of cosmic rays coming across the Southern sky. We also see that, as we were expecting, these regions are about 20 degrees in size. The causes of these “hotspots” are still unknown, but we’re working to see if we can determine what’s causing them. Possible reasons include nearby pulsars, the configuration of the local interstellar magnetic field, or a combination of these two factors, but more information is needed to determine what the possible sources of this anisotropy may be.

This is how the Southern cosmic ray sky looks at TeV energies once structures larger than ~60 degrees have been filtered out. Regions with an excess (red) or deficit (blue) of cosmic rays compared to an isotropic sky are clearly visible.

Similar excesses and deficits have been observed in the past by experiments located in the Northern hemisphere, but this is the first detection of this kind of structure in the Southern sky. You can take a look at the preprint of the paper we submitted to the Astrophysical Journal here. We’re right now trying to organise a workshop in October where we will discuss possible theories and the details of observations made in the North with colleagues from other experiments.

Interesting times ahead!

 

 


Congratulations LHCb!!!

Saturday, May 28th, 2011

Just a quick post today to explain this LHC status from last night:

What was this about you ask? As I’ve mentioned previously, the target instantaneous luminosity for LHCb is \(2 \times 10^{32} cm^{-2} s^{-1}\) to \(3 \times 10^{32} cm^{-2} s^{-1}\).

LHCb started taking data within this target instantaneous luminosity on the 1st of May with 756 colliding bunches corresponding to an instantaneous luminosity of \(2.15 \times 10^{32} cm^{-2} s^{-1}\). Last night the experiment moved into unknown territory, collecting data at an instantaneous luminosity of \(3 \times 10^{32} cm^{-2} s^{-1}\).

Experts have been carefully monitoring the detector behaviour and data quality, but so far it would seem that everything is performing well. Congratulations are indeed in order. 🙂


I don’t know how this keeps happening, but there is always some neat tool or new feature that the folks over at Google roll out that completely blows me away! This time it is their tool out of Google Labs called Google Correlate. I’ve only just seen this (thanks to my buddy Homer Wolfe for posting it on Facebook) and I’m already floored.

What I can gather is that this lets you see how certain search terms are correlated over time, or location, or many other variables that I’m still exploring. The first example that I saw was a search term for Stalin (as in Joseph Stalin). As it turns out, no one searches for Stalin in the summer….weird right?!?!

Frequency of search terms for "Stalin"

As if this wasn’t enough to blow me away, you can click on the link ‘Search by Drawing’ (http://correlate.googlelabs.com/draw) and draw in your own frequency pattern, and it will return a series of search terms that look like what you’ve drawn and tell you how correlated each one is.

So I drew this:

What I drew (something that peaks every couple of years)

The correlation result

Really cool tool that gives all kinds of intuitive ways to search their search data! Since particle physics people are usually pretty geeky when it comes to how to search data I thought this would be a tool many people would like to play with!

Imagine the different implications for how searches could be done with this sort of manipulation of data at a visual and intuitive level. Now granted, most of the searches that go on in particle physics are of this sort (looking for bumps or strong correlations in the data), but the tools that Google is making to allow this to be more intuitive are really remarkable!


Editor’s Note: Fermilab is getting ready for its annual meeting that draws together many of the 2,311 scientists across the U.S. and the globe who work with Fermilab, as well as staff physicists and engineers. While it could be a time of sadness and reflection with the Tevatron set to shut down, physicists are finding that Fermilab still has a lot to offer in terms of exciting, ground-breaking science, as Fermilab Director Pier Oddone outlines in his weekly column.

This article first appeared in Fermilab Today May 24.

The 44th edition of the Users’ Meeting will take place on June 1 and 2, and it should be very exciting. The Users’ Meeting is a well-established tradition at Fermilab. Every year it showcases results from the entire Fermilab experimental program, alongside discussions of the lab’s future program and presentations from government officials about policies applicable to particle physics. This year we are very fortunate to have the Secretary of Energy, Dr. Steven Chu, presenting the Meeting’s public lecture at 8 p.m. on June 2.

This year has a special edge as we approach the end of data collection at the Tevatron. This remarkable machine is achieving luminosities considered impossible decades ago with antiprotons: more than \(4 \times 10^{32} cm^{-2} s^{-1}\) instantaneous luminosity, with 11 inverse femtobarns of accumulated luminosity recently celebrated.

The Tevatron’s two international collaborations, CDF and DZero, have many achievements of their own, including major discoveries that have established our Standard Model of particle physics. There is still juice left in the Tevatron and we may yet establish processes beyond the Standard Model if some of the collaborations’ recent results are confirmed. We also have hints of unexpected results in the neutrino sector, with neutrino oscillation data from MiniBooNE and MINOS.

Looking to the future, MINERvA is laying the foundation for understanding different nuclear targets, NOvA construction is proceeding well, and there are new proposals to extend MINOS running. The Dark Energy Survey is nearing completion, better detectors are in development for the Cryogenic Dark Matter Search, and the COUPP dark matter search is operating a small prototype at Sudbury and a larger 60 kg prototype in the NuMI tunnel. Pierre Auger continues to provide interesting results with ultra-high-energy cosmic rays. And the LHC is working splendidly and results are coming out at a fast pace.

We are also in a critical year for two long-term projects, LBNE and Project X. In addition to Project X’s broad Intensity Frontier physics program, it can serve as a foundation for a neutrino factory if one is needed to fully understand the physics of neutrinos. Looking even farther ahead, we are studying the feasibility of muon colliders as a path back to the Energy Frontier.
All this activity augurs a great Users’ Meeting next week.

–Pier Oddone, Fermilab director

Runners taking part in the relay race

Recently CERN had its annual relay race. It’s a great change from a normal day of work (whatever that means!) and a chance to mix with people from all parts of the lab. Teams compete to run the circuit around the Meyrin site, hoping to win prizes in all sorts of categories from best veterans (combined age of over 270) to best all-female team. There’s even a prize awarded using a random number generator. There’s a task force of 12 experts who spend 3 months writing an algorithm to produce a truly random number. (Only kidding, someone draws a piece of paper from a box!) This is one of the few days in the year that brings the whole lab together, so there are plenty of stalls around for the different social and sporting clubs and even a band that performs.

I didn’t realize what struck me most about the event until a few days later though. Last year when I came to Geneva looking for an apartment I found somewhere nice within a couple of days and had a spare day to explore Geneva and CERN. That day happened to be the CERN relay race, and it took me by surprise! So that means I’ve been here for nearly a year already, and wow has that time flown by! What has happened since then? Well, there have been some interesting results coming from the experiments, including the ATLAS di-jet asymmetry and the CMS ridge. We’ve taken loads of data (although this is just the tip of a very big iceberg!) both with proton collisions and heavy ion collisions. The race for the Higgs is now in full swing as we pass 300 pb⁻¹ of data. There has been a lot of media attention, the Universe of Particles exhibit opened, the tram arrived and the main areas of CERN are being completely remodeled.

The awards ceremony

Personally things have changed a great deal as well. My early days on the experiment consisted mainly of working for two masters (ATLAS at CERN and BaBar at SLAC) which meant very long days in building 40 for the first couple of months. Since then my knowledge of ATLAS has increased steadily, with a search for the exotic charged Higgs boson which allowed me to learn about jets, tau leptons and missing energy at ATLAS. In parallel to that I’ve worked on the trigger which has been an ongoing challenge, and forced me to get to grips with nearly all the main parts of ATLAS software, from writing my own modules to defining a new ntuple transformation (going from one format of data to a new format.) It’s rather rewarding and reassuring to be able to define a completely new data format like that!

Of course this is by no means everything that happens to physicists at CERN! I’ve traveled all the way to Beijing (and back again on the trans-Mongolian railway!) to present my work, and our analysis on the charged Higgs has also been published. I’ve mentored a grad student, given a class on computer programming, and in my spare time I’ve set up and organized the LGBT group at CERN. As the workload started to ease off, I decided to take up blogging for US LHC Blogs, and that’s been a great way to meet people! This blog is one of the best ways to get a sense of what is happening at CERN and what the latest news is. There’s certainly a very active grapevine in high energy physics, and rumors are constantly circulating. It’s hard to know what’s what, so joining this blog is a refreshing change and gives all kinds of insights that aren’t found elsewhere.

So things have come a long way in a year, and far too much has happened to fit into a single blog post. It feels like I’ve been running just to keep up with everything! But despite all that’s changed, the relay race still managed to take me by complete surprise, again. Next year I’ll be ready for it…


Balance…and Greatness

Tuesday, May 24th, 2011

–by Josie Farrell, CHRP, Manager of Employee HR Services

During a recent trip to the UK, I was able to visit Beachy Head, a very dramatic chalk cliff on the south coast. The spot reminded me of the incredible impact that our own personal surroundings can have on our well-being. The chalk cliffs are the highest sea cliffs in Britain, and they are pristine and open with no boundaries or fences. Sitting at the cliff’s edge and looking out at the expansive English Channel really focused my mind. It was like my own personal retreat. (Did I mention I did not have my laptop?)

How easy it is to just accept all the clutter in our lives. Perhaps working in Human Resources makes me more attuned to this fuzzy stuff, but let’s face it, most of us spend most of our time trying to survive the chaos in our daily lives: commuting, traffic, work demands, dealing with vast amounts of e-mail, meetings, paper clutter and files, getting kids to and from school and pets to the vet, and don’t forget that significant other. You know what I mean. Our worlds can be chaos. Even physicists experience this. I know. They’ve told me so. They are human, too!

I work with many who try to juggle work and life issues to achieve some sort of balance while still focusing on their world-class research projects. Somehow, amazingly, they seem to do it. But like anything in life, there has to be moderation, there really must be balance. Each of you reading this has demands on your time.

Whenever our schedules become out of balance, our energy drops. I have read that lowered energy creates the illusion that there isn’t enough time in a day, so a vicious cycle of time limitation occurs. So why not create some simplicity in your life? You may not be able to escape and go sit at Beachy Head like I did, but you could clean some clutter from your work space or at home, or just turn your computer off. Create your own retreat for at least a moment or two in your day.

When asked how we should live our lives, Stephen Hawking replied: “We should seek the greatest value of our action.”


Since deciding to become a high energy physicist I’ve had a much harder time answering a question often asked of scientists, “What’s the practical application?”  After all, High Energy Physics is, for the most part, a basic science, meaning its long term goals are to increase our understanding of the natural world.  Whereas in applied science (such as hydrogen fuel cell research) there is usually a targeted application from the get-go (i.e. hydrogen-powered automobiles).

When asked what’s the practical application of my research, I have a tough time answering.  After all, I study experimental Quantum Chromodynamics, and a “practical application” such as the light bulb (an application of electromagnetism) or the transistor (quantum mechanics) may not arise in my lifetime.  But what I can say is that the technologies developed to perform my research have a noticeable impact on our society (much like the benefits of the Space Program).

I thought today it might be interesting to talk about one such technology….namely the software used by high energy physicists.

Now each experiment at the LHC has its own unique software and computing environment (this is by design).  I can’t speak for the other experiments, but researchers within the CMS Collaboration have created something called CMSSW (or the CMS Software frameWork).  This software framework uses C++ plugins in a Python-based environment to analyze all experimental data taken by the CMS detector, and all simulated data created by the CMS Collaboration.  However, to use CMSSW (and the software of the other LHC experiments) you must be a member of the collaboration.

But rather than discussing CMSSW, I would like to discuss something common to all LHC experiments (and available to the general public): ROOT.  It is this “practical application” that I’d like to bring to your attention.

(Readers less experienced with programming languages may want to see the “Coding” section of one of my older posts for some background info).

 

What is ROOT?

ROOT is an object-oriented software framework that uses a C++ interpreter to write scripts/macros for data analysis.  There are many pre-defined classes and methods available in ROOT; these are designed to enable a user to quickly & efficiently access large amounts of data and perform analysis.  ROOT has both a command line interface and a graphical user interface, so modifications can be made either “on the fly” or by re-running a script/macro.

ROOT is very powerful, and it is possible to incorporate other libraries (such as the C++ Standard Template Library & others) into ROOT scripts/programs.
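Here is a tiny sketch of what I mean, a hypothetical macro of my own (not something from any experiment’s code) that mixes an STL vector with a ROOT histogram:

#include <vector>
#include "TH1D.h"

void fillFromVector() {
   // An ordinary STL container holding some made-up measurements
   std::vector<double> values;
   values.push_back(1.2);
   values.push_back(3.4);
   values.push_back(2.2);

   // A one-dimensional ROOT histogram: name, title, number of bins, low edge, high edge
   TH1D *h = new TH1D("h", "My measurements", 10, 0.0, 10.0);

   // Loop over the STL vector and fill the ROOT histogram
   for (size_t i = 0; i < values.size(); ++i) h->Fill(values[i]);

   h->Draw();
}

Saved as fillFromVector.C, this runs directly with root fillFromVector.C; no compilation step needed.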

But, programming jargon aside, what can you actually do with ROOT?  Simple answer: lots.

ROOT is perfect for creating graphics, namely graphs & plots of interesting data.  But it can also be used to perform more useful tasks, such as numeric integration or differentiation.  ROOT also has several aspects from linear algebra built in (so you can do matrix multiplication/addition with it).  ROOT even enables a user to perform high level custom curve fits.
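As a flavour of those features, here are a few lines you could type at the ROOT prompt (the function and matrices are invented for illustration):

// A function object: name, formula, range
TF1 gauss("gauss", "exp(-0.5*x*x)", -5.0, 5.0);

// Numerical integration and differentiation of that function
double area  = gauss.Integral(-5.0, 5.0);   // close to sqrt(2*pi) ~ 2.5066
double slope = gauss.Derivative(1.0);       // derivative evaluated at x = 1

// A taste of the built-in linear algebra: 2x2 matrix multiplication
TMatrixD a(2, 2), b(2, 2);
a(0,0) = 1.0; a(0,1) = 2.0; a(1,0) = 3.0; a(1,1) = 4.0;
b(0,0) = 0.0; b(0,1) = 1.0; b(1,0) = 1.0; b(1,1) = 0.0;
TMatrixD c = a * b;                          // the matrix product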

In fact, in some ways ROOT is very similar to programs like Mathematica & MATLAB.

However, ROOT has a distinct advantage over these products: it’s free.  ROOT can be downloaded by anyone, and has a rather detailed User’s Guide and a set of Tutorials/HowTo’s that can show new users how to perform a specific task.

But, enough boasting, let’s show some examples so you can get a feel for what ROOT can do!  I’m going to show some simple commands and their outputs, if you’d like to try them out yourself feel free.  My goal with this post is to get you interested in ROOT, not necessarily show you how to use it (guides such as that already exist! See links above!).

 

Example: Visualization

Suppose I was interested in observing the jet topology (or how the jets appear in space) in a particular proton-proton collision event.  There are several ways I could do this.  The first of which is to make what’s called a “lego plot.”  In a lego plot, I place the jet in space based on its angular coordinates, the polar angle, θ, and the azimuthal angle, Φ, and then each point is “weighted” by its momentum component in the xy-plane (termed pT).  To see how these angles & the xy-plane are defined in CMS, see the diagram below:

 

But in high energy physics θ is not very useful; instead we use a related variable called η, the pseudorapidity, defined as η = −ln[tan(θ/2)] (so η = 0 corresponds to a direction perpendicular to the beam, and large |η| to directions close to the beam axis).

So in a lego plot I take all the jets in my event, and I plot them by their eta & phi values.  This is very simple to do in ROOT, and for this task I’m going to make a two dimensional histogram:

TH2D *LegoPlot = new TH2D("LegoPlot", "Jet Topology", 50, -5.0, 5.0, 50, -3.1416, 3.1416);  // name, title, then binning in eta and phi

LegoPlot->Fill( Jet.eta(), Jet.phi(), Jet.pt() );

Where the first line creates an instance of a two dimensional histogram object (giving it a name, a title, and a binning in η and φ), and the second line stores the jet’s η, Φ, & pT as a weighted (x,y) point; but let’s call this an (η,φ) point instead.  This is literally all I need to type.  Of course this is just for one jet; I could put the second line within a loop structure so that I could enter all the jets in my event, as sketched below.
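A minimal sketch of that loop; the myJets container and its eta(), phi() and pt() accessors are hypothetical stand-ins for whatever jet objects your analysis provides:

// Fill one (eta, phi) point per jet, weighted by its transverse momentum
for (unsigned int i = 0; i < myJets.size(); ++i) {
   LegoPlot->Fill( myJets[i].eta(), myJets[i].phi(), myJets[i].pt() );
}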

To visualize this output, I simply need to type:

LegoPlot->Draw("lego2");

Where “lego2” is an option of the Draw command.  The output of this command is then:

 

Three Jet Event in CMS

 

Here ROOT will automatically open up a new window, and draw the plot for us…it even gave us some statistics regarding the plot (upper right corner).

And all this was done with one line of code!

But, unfortunately the plot isn’t labeled, so we can’t make sense of it quite yet.  We could use the graphical interface to add a label, or we can use the command line approach.  The GUI is great, but if I have to make this plot over and over again from multiple data files, I’m going to get really tired of using the GUI each time.  So instead, I could use the command line interface and write a script to have ROOT do this for me.  The commands I would use are:

LegoPlot->SetXTitle("#eta");

LegoPlot->SetYTitle("#phi (Radians)");

LegoPlot->SetZTitle("p_{T} (GeV/c)");

Then upon running my script ROOT would automatically add these titles to the plot.

The use of “#” signs in the above lines lets ROOT know that I don’t just want the axis to say “eta” but that I want the axis to display the symbol “η.”  The underscore with the {} brackets informs ROOT that I also want a subscript (superscripts are done with ^{...text...}).  So with a few lines of code in the ROOT framework I have not only stored data, but shown it graphically.

I never had to compile anything, and I didn’t need to spend time building my GUI!

The final plot result is shown here:

Three Jet Event in CMS, with Labels!

 

But this η-Φ plot really hasn’t helped me visualize the jets in 3D; after all, CMS is a giant cylinder.  The above plot is what you would get if you took a pair of scissors to the cylinder (starting at the x-axis) and cut down a line parallel to the z-axis.  This would then “un-roll” the cylinder into the flat plane above.

But what if I wanted to view this plot in actual “η-Φ” space?  Well ROOT can do that too, and in one line of code!

LegoPlot->Draw("psrlego2");

The familiar “lego2” is still there, but now I’ve added “psr” to the options of the draw command.  ROOT understands psr to mean 3D pseudorapidity coordinates.  The output of this option is shown here:

 

Three Jet Event in CMS, in eta-phi space
Again, in a simple command I’ve been able to do some very intense plotting.  Of course these are just a few brief examples.  I am by no means trying to give an all inclusive guide to how to use ROOT.  As I’ve mentioned, those already exist (see the user’s guide, tutorials & how-to’s I’ve linked above).

Example: Curve Fitting

I think one of the most challenging things in all of science is curve fitting.  The reason I believe it is challenging is two-fold: first, you have to know what kind of curves would do well in describing your data; second, curve-fitting software is usually very expensive!

However, as I mentioned, ROOT is free!  And can perform very powerful curve-fitting techniques very simply.

Suppose I’ve made a histogram of an observable, and kept track of the number of counts for each value of my observable (this is my measurement).  Let’s say it looks like this:

 

Example Measurement

Now let’s say I’m interested in fitting a curve to this data.  Ordinary office programs such as Open Office Spreadsheet or Microsoft Excel have the ability to do simple fits such as polynomials, or simple exponentials.  But beyond a correlation coefficient, I’m not going to get much out of a fit from one of those programs.  I also don’t really get much functionality from them either.

Let me elaborate further on that part.  The above graph has a horizontal asymptote at one.  Let’s say I want to incorporate this behavior into my fit.  Well, I happen to know that a function of the form \(f(x) = 1 - e^{-x}\) has this asymptotic behavior.

But the above function is just too simplistic; it doesn’t allow for any “shifts” or changes in the data from that expression.  Instead, let’s toss in a few parameters, called A & B; these parameters will give us some more flexibility in our fitting procedure: \(f(x) = 1 - e^{A x + B}\).

This is again simplistic, but staying simple is usually a good rule of thumb in science.

So we’ve settled on a function to fit to our data.  How do we implement it in ROOT?  Again, it is very simple: we use the function class already available in the ROOT framework:

TF1 *func = new TF1("func", "1.0 - exp([0]*x + [1])", 0, 40);

Here, I’ve set up a new function.  The first word in quotes is my function’s name, “func.”  The second set of quotes is the mathematical expression I want the function to use, with [0] and [1] being our parameters A & B.  Then the last two numbers are the range of the x-variable that the function will be defined for.

This should immediately illustrate the power of ROOT.  In one line, I can tell ROOT symbolically what mathematical expression I want it to use for fitting.  I can construct any function imaginable, with any number of parameters, just by typing it out to ROOT.  ROOT will even recognize trigonometric functions, along with others.  I can even construct numeric functions (but this takes more code than one line).
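As a rough illustration of that last point (a toy example of mine, a Gaussian on top of a flat term, not anything used elsewhere in this post):

// A compiled-style function: x[0] is the variable, par[] are the fit parameters
double myFunction(double *x, double *par) {
   return par[0] * TMath::Gaus(x[0], par[1], par[2]) + par[3];
}

// Wrap it in a TF1: name, the C++ function, x range, and number of parameters
TF1 *numFunc = new TF1("numFunc", myFunction, 0.0, 40.0, 4);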

Now to perform the fit I just tell the histogram above (call it “Histo”) that I want to fit a function to it.  This is done by:

Histo->Fit(func, "", "", 3, 40);

The quotes in the above expression tell ROOT how to perform the fit.  Right now there’s nothing in the quotes, so ROOT will just use its default fitting method (chi-squared minimization), in the range of x equals 3 to 40.
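If you do want to steer the fit, you drop option letters into that first set of quotes.  A couple of real ROOT options, applied to the same Histo and func as above:

// "L": use a log-likelihood fit instead of the default chi-squared minimization
// "Q": quiet mode, suppress the fit printout
Histo->Fit(func, "LQ", "", 3, 40);

// "R": use the range defined when the TF1 was created, instead of passing limits here
Histo->Fit(func, "R");

For the rest of this example I’ll stick with the default call shown earlier.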

Executing this command causes ROOT to perform the fit and spit back the values for my parameters A & B along with their errors:

 

Fit Output

Here the parameters [0] and [1] are labeled as “p0” and “p1.”  There is a column for their values (“VALUE”), and a column for their errors (“ERROR”).  Up at the top I can see that the fit converged, and that ROOT took 86 attempts/iterations in its fitting process.

The “Histo->Fit….” command will also plot the original histogram with the fit overlaid, as shown here:

 

Result of Fit

 

ROOT has also placed the fit parameters in the statistics box.  From the χ²/ndf we see that the fit wasn’t a very good one mathematically, but we weren’t really trying very hard here either.  With a better fit function and a more advanced fitting procedure we can get χ²/ndf ≈ 1.0 (exactly what we want to have!).
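And if you would rather have those numbers back in your own script than read them off the canvas, the fitted function carries them (using the same func object from above):

// Goodness of fit: chi-squared and number of degrees of freedom
double chi2 = func->GetChisquare();
int    ndf  = func->GetNDF();

// Best-fit values and uncertainties of the parameters A ([0]) and B ([1])
double A    = func->GetParameter(0);
double Aerr = func->GetParError(0);
double B    = func->GetParameter(1);
double Berr = func->GetParError(1);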

 

In Closing

My goal with this post was to illustrate a product that has come about because of High Energy Physics research, and show that it could be beneficial for the rest of society.  Hopefully this will spark your interest in ROOT for science/engineering/mathematics applications.  There is an extensive ROOT community and support system that you may turn to if you decide to learn ROOT and encounter problems/questions.

I would highly recommend ROOT for any of our readers who are students with technical majors (at all levels).

 

Until next time,

-Brian


Fermilab theoretical physicist Paul Mackenzie, spokesperson for the USQCD collaboration. Photo credit: Reidar Hahn.

The field of high-energy physics has always considered itself a family. To address some of the largest questions, such as how were we and the universe formed, it takes building-sized machines, enormous computing power and more resources than one nation can muster. This necessary collaboration has forged strong bonds among physicists and engineers across the globe.

So naturally, when a tsunami and series of earthquakes struck Japan on March 11, physicists in the U.S. started asking how they could help the country that is home to one of the world’s largest high-energy physics laboratories and an accelerator research center. It turns out that they have a unique resource to offer: computer power.

Lattice Quantum Chromodynamics (QCD) is a computational technique used to study the interactions of quarks and gluons, and it requires vast computing power. To help the Japanese continue this analysis, Fermilab and other U.S. labs will share their Lattice QCD computing resources.

 “We’re very happy that the shared use of our resources can allow our Japanese colleagues to continue their research during a time of crisis,” said Fermilab theoretical physicist Paul Mackenzie, spokesperson for the USQCD collaboration.

From now until the end of 2011, while computing facilities in eastern Japan face continuing electricity shortages, a percentage of the computing power at Brookhaven National Laboratory on Long Island, Fermi National Accelerator Laboratory near Chicago and Thomas Jefferson National Accelerator Facility in Virginia will be made available to the Japanese Lattice Quantum Chromodynamics (QCD) community.

“We appreciate the support from the U.S. QCD community,” said University of Tsukuba Vice President Akira Ukawa, spokesperson of the Japanese Lattice QCD community. “The sharing of resources will not only be instrumental to continue research in Japan through the current crisis, but will also mark a significant step in strengthening the international collaboration for progress in our field.”

Read the Fermilab press release here: http://www.fnal.gov/pub/presspass/press_releases/2011/USQCDrelease_052311.html

Related news:

 Japanese helped foreign scientists during quake

Japanese earthquake jolts Tevatron, emotions

Damage caused by the recent earthquake and recovery prospects
